Sepehr Assadi:
Keynote: Graph Coloring, Palette Sparsification, and Beyond
Graph coloring is a central problem in graph theory and has numerous applications in diverse areas of computer science. An important and well-studied case of graph coloring is the Δ+1 (vertex) coloring problem, where Δ is the maximum degree of the graph. Not only does every graph admit a Δ+1 coloring, but in fact we can find one quite easily in linear time and space via a greedy algorithm. But are there more efficient algorithms for Δ+1 coloring that can process massive graphs which even this algorithm cannot handle? This talk overviews recent results that answer this question in the affirmative across a variety of models dedicated to processing massive graphs – streaming, sublinear-time, massively parallel computation, distributed communication, etc. – via a single unified approach: Palette Sparsification.
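For readers unfamiliar with the baseline the abstract refers to, here is a minimal Python sketch of the greedy Δ+1 coloring algorithm (illustrative only; the graph representation and function name are our own, not from the talk):

```python
def greedy_delta_plus_one_coloring(adj):
    """Color an undirected graph, given as {vertex: set_of_neighbors}, with at
    most Delta+1 colors: each vertex takes the smallest color not used by its
    already-colored neighbors. Since a vertex has at most Delta neighbors,
    some color in {0, ..., Delta} is always available."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# Example: a triangle with a pendant vertex (Delta = 3, so at most 4 colors
# are ever needed; the greedy pass happens to use 3 here).
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(greedy_delta_plus_one_coloring(adj))
```

The point of the talk is that even this single linear pass over the graph can be too expensive for truly massive inputs, motivating the sparsification-based approaches it surveys.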
About the speaker:
Sepehr Assadi is an Assistant Professor of Computer Science at Rutgers University. His primary research interests are in the theoretical foundations of processing massive datasets, in particular streaming and sublinear algorithms and lower bounds for massive graph problems. He received a Ph.D. in Computer Science from the University of Pennsylvania in 2018 and spent a year as a postdoctoral researcher at Princeton University before joining Rutgers. Sepehr is a recipient of an NSF CAREER award, a Google Research Scholar award, the EATCS Distinguished Dissertation Award, the ACM-EATCS Principles of Distributed Computing Dissertation Award, and several best paper awards at theoretical computer science conferences including SODA, SPAA, and DISC.
Roberto Baldoni:
Keynote: Managing the Cyber Risk in a Decoupled World: does this bring potential opportunities in computer science?
The last thirty years have witnessed the growth of both globalization and digital transformation, with information systems becoming interconnected and distributed on a worldwide scale and IT turning into a commodity. Cloud computing and blockchain are examples of such robust, distributed technologies, and they have been main drivers of this globalization process. Global technologies and infrastructures paved the way for organic and highly frequent interactions among millions of companies and organizations in multiple countries, almost irrespective of geopolitical implications, establishing complex, globally interconnected supply chains whose main aim was keeping software and device costs low. This created a virtuous loop that generated an exponential increase in countries' digitalization and in globalized industries. The likely trend of the next few years will be a progressive decoupling of supply chains, particularly for the software and hardware employed in a nation's vital services. This will be a long process, and not an economically neutral one, leading in the medium term to "friendshoring" or "almost domestic" supply chains in which robust technologies and algorithms compliant with societal values are developed. It is expected to increase the number, magnitude, and complexity of cyber attacks coming from other geopolitical blocs for espionage or terrorist purposes, in a continuous hybrid-warfare scenario. Computer scientists and engineers will have to cope with the new challenges of this decoupled world. The keynote will attempt to shed some light on what this could imply in terms of technology, computing paradigms, and national IT capability.
About the speaker:
Roberto Baldoni is the first Director General of the National Cybersecurity Agency of Italy, appointed by Prime Minister Draghi. He previously served for four years as Deputy Director General of the Italian intelligence department, and before that spent fifteen years as a full professor of Computer Science at La Sapienza University of Rome. Roberto chaired the steering committees of the DISC and DSN conferences. His research interests were mainly focused on dependable and secure distributed systems.
Jennifer Welch:
Keynote: Using Linearizable Objects in Randomized Concurrent Programs
Atomic shared objects, whose operations take place instantaneously, are a powerful technique for designing complex concurrent programs. Since they are not always available, they are typically substituted with software implementations. A prominent condition relating these implementations to their atomic specifications is linearizability, which preserves safety properties of programs using them. However, linearizability does not preserve hyper-properties, which include probabilistic guarantees about randomized programs. A more restrictive property, strong linearizability, does preserve hyper-properties, but it is impossible to achieve in many situations. In particular, we show that there are no strongly linearizable implementations of multi-writer registers or snapshot objects in message-passing systems. On the other hand, we show that a wide class of linearizable implementations, including well-known ones for registers and snapshots, can be modified to approximate the probabilistic guarantees of randomized programs when using atomic objects.
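To make the setting concrete, here is a minimal, illustrative Python sketch of a linearizable (lock-based) multi-writer register shared by a randomized program. It is not taken from the talk and does not capture the strong-linearizability subtleties; it only shows the kind of object and randomized usage the abstract is about:

```python
import random
import threading

class LinearizableRegister:
    """A toy multi-writer register: a lock makes each read and write appear
    to take effect atomically to the threads sharing it."""
    def __init__(self, initial=None):
        self._lock = threading.Lock()
        self._value = initial

    def write(self, value):
        with self._lock:
            self._value = value

    def read(self):
        with self._lock:
            return self._value

def racer(reg, my_id):
    # Each thread flips a coin before (possibly) writing its id. Reasoning
    # about the probability of who "wins" is exactly the kind of guarantee
    # that plain linearizability does not, in general, preserve against
    # adaptive adversarial scheduling.
    if random.random() < 0.5:
        reg.write(my_id)

reg = LinearizableRegister()
threads = [threading.Thread(target=racer, args=(reg, i)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("last writer observed:", reg.read())  # may be None if no one wrote
```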
About the speaker:
Jennifer L. Welch received her S.M. and Ph.D. from the Massachusetts Institute of Technology and her B.A. from the University of Texas at Austin. She is an emeritus professor in the Department of Computer Science and Engineering at Texas A&M University and an ACM Distinguished Member. Her research interests are in the theory of distributed computing, especially dynamic networks and distributed data structures.