LAAS Research and Technology Days

13/07/2017 - 11:00

Smart Cities are an example of Cyber-Physical Systems whose goals include improvements in transportation, energy distribution, emergency response, and infrastructure maintenance, to name a few. One of the key elements of a Smart City is the ability to monitor and dynamically allocate its resources. The availability of large amounts of data, ubiquitous wireless connectivity, and the critical need for scalability open the door for new control and optimization methods which are both data-driven and event-driven. The talk will present such an optimization framework and its properties. It will then describe several applications that arise in Smart Cities, some of which have been tested in the City of Boston: a “Smart Parking” system which dynamically assigns and reserves an optimal parking space for a user (driver); the “Street Bump” system which uses standard smartphone capabilities to collect roadway obstacle data and identify and classify them for efficient maintenance and repair; adaptive traffic light control; optimal control of connected autonomous vehicles. Lastly, the talk will describe how a large traffic data set from the Massachusetts road network was analyzed to estimate the Price of Anarchy in comparing “selfish” user-centric behavior to “social” system-centric optimal traffic routing solutions.
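As a toy illustration of the Price of Anarchy concept mentioned above (not drawn from the talk's Massachusetts data set), consider Pigou's classic two-link network with unit demand, where link 1 has latency c1(x) = x and link 2 has constant latency c2(x) = 1:

```python
# Toy Pigou-network example of the Price of Anarchy (illustrative only).
def total_cost(x1):
    """Total travel cost when a fraction x1 of traffic uses link 1."""
    return x1 * x1 + (1 - x1) * 1.0

# Selfish (Wardrop) equilibrium: everyone takes link 1, whose latency
# never exceeds that of link 2.
selfish = total_cost(1.0)

# Social optimum: minimize total cost over the split (attained at x1 = 0.5).
social = min(total_cost(i / 1000) for i in range(1001))

poa = selfish / social
print(poa)  # the classic 4/3 bound for linear latencies
```

For linear latency functions this ratio is known to be at most 4/3; empirical studies like the one described in the talk estimate the analogous ratio on real road networks.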


03/06/2016 - 14:00
A microgrid is an electrical distribution network consisting of distributed generators, local loads, and energy storage systems that can operate in grid-connected or islanded modes. Different technologies are combined, such as power converters, control, communications, and optimization. In this way, energy can be generated and stored near the consumption points, improving stability and reducing the losses produced by long power lines. Multi-agent system technologies will be presented: in distributed energy systems like microgrids, distributed control is a powerful tool. Previous experience in the Danish electrical system, such as the Cell Controller project, used these technologies to balance dispersed energy generation and consumption. Other examples of real sites, including conventional islanded systems installed on islands and in remote rural areas, will be shown. Finally, low-voltage distribution systems and DC microgrids for residential applications and homes will be introduced. New worldwide projects to develop technologies for low-voltage DC distribution systems will be shown.
25/03/2016 - 11:00
The Japanese space mission Hayabusa was the first Sample & Return mission from a Near Earth Asteroid. Launched on 9 May 2003 from the Japanese Kagoshima Space Center, Hayabusa travelled for two years to reach the asteroid Itokawa. After some proximity studies, Hayabusa landed on the surface and, overcoming several technological problems, collected some dust from it. The sampled dust returned to Earth in 2010. Professor Junichiro Kawaguchi, Hayabusa's Project Manager, will present some technological and scientific aspects of this mission. The Hayabusa mission had high engineering importance for JAXA as a test of new technologies in the fields of ion engines, close-proximity movement around objects with low gravity, and autonomous and optical navigation. From the mission analysis to the spacecraft rendezvous powered by recently developed ion thrusters, technological and computational challenges were overcome. In particular, ways of combining low thrust and gravity assists to change orbits, as well as ballistic reentry from an interplanetary orbit, were of great interest. Furthermore, the design of the touchdown and the sample recovery posed difficult problems for which new solutions had to be invented. All these technological achievements served the scientific aims of the Hayabusa mission. Scientists' understanding of asteroids has been greatly improved by studying meteorite samples. However, matching meteorite samples collected on Earth with the exact asteroids from which they came remains an issue. Hayabusa helped overcome this by bringing back pristine samples from a specific, well-characterized asteroid. In other words, Hayabusa bridged the gap between ground observation of asteroids and laboratory analysis of meteorite and cosmic dust collections.
26/03/2015 - 17:00
Flight-critical subsystems in aerospace vehicles must achieve failure rates of less than 1 failure in 10^9 flight hours (i.e. less than 1 failure per 100,000 years of operation). Systems that achieve this level of reliability are hard to design, hard to verify, and hard to validate, especially if software is involved. In this talk, I will discuss some of the challenges the aerospace community faces in designing systems with this level of reliability and how it is beginning to approach them. We leverage results from the computer science, robotics, and controls communities to develop methods and tools for designing control policies, specifying the properties of the resulting distributed embedded system and the physical environment, and proving that the specifications are met. Application areas include autonomous driving, vehicle management systems, and distributed multi-agent systems.
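The "100,000 years" figure quoted in the abstract follows directly from the 10^9 flight-hour requirement; a quick check of the arithmetic:

```python
# Converting the 1-failure-per-1e9-flight-hours requirement into years
# of continuous operation (sanity check of the figure in the abstract).
HOURS_PER_YEAR = 24 * 365          # 8,760 hours in a (non-leap) year
years = 1e9 / HOURS_PER_YEAR
print(round(years))                # ~114,155 years, i.e. over 100,000 years
```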
12/11/2014 - 11:30
The diffusion and relaxation mechanisms of complex materials at the atomic level are still poorly understood, both theoretically and experimentally. How, indeed, can we follow the movement of atoms over timescales relevant to understanding aging, fracture propagation, and self-assembly? To do so, we must close the gap between thermal vibrations, which occur on the picosecond scale (10^-12 s), and these phenomena, which take place over seconds and even years. The scientific community has been concerned with this problem for decades, and many methods, long very limited, have been proposed over the years. These efforts, coupled with the constant improvement in supercomputer performance, have led to significant advances in recent years that finally make it possible to bridge the gap between thermal vibrations and structural changes over long timescales for generic materials. In this seminar, I will briefly present the problems and the evolution of the methods that allow us to cover this large dynamic range.
27/05/2014 - 17:00
Networking is about sharing resources, from sharing links with packet-switching to sharing servers in the cloud with virtual machines and parallel algorithms. Progress is driven mostly by clever inventions. In many cases, modeling and analysis lead to improved schemes. The time and space scales of networks necessitate distributed algorithms. Analyzing these algorithms often requires new methods for approximating the complex random dynamics and simplifying the objectives of the control schemes. The demand and available resources are determined by users and businesses. Thus, an economic layer interacts with the technology layer. These interactions are governed by relationships between demand, quality, and prices. These relationships may result from strategic behavior. As the technology evolves, new problems emerge that raise new challenges for researchers. In this talk, I review some of the problems that have fascinated me.
28/10/2013 - 18:30
Recently, the European Commission selected the Human Brain Project (HBP) as one of two Flagship projects, each receiving projected funding of 1 billion euros. The Human Brain Project builds on the foundations of the Blue Brain Project, founded in 2005 with the mission of integrating anatomical, electrophysiological, and genetic data into a unified model of the neocortical column. Over the course of 10 years, the HBP will create an integrated system of six ICT platforms, dedicated to Neuroinformatics, Brain Simulation, Medical Informatics, High Performance Computing, Neuromorphic Computing, and Neurorobotics. These platforms will lay the foundation for a new kind of ICT-based brain research. Three of the HBP platforms are dedicated to investigating novel, brain-derived technologies for future computers and computer science. The High-Performance Computing Platform will use brain-derived principles to design the next generation of supercomputers. The Neuromorphic Computing Platform investigates how brain-derived principles can be directly embodied in new computing hardware. Finally, the Neurorobotics Platform will allow researchers to use and explore brain models in simulated closed-loop robotics experiments. In this talk I will give an overview of the Human Brain Project. I will then describe in more detail the preparatory research for the Neurorobotics Platform, building on the pioneering work of the Blue Brain Project. Along the way, I will summarize the challenges and opportunities the Human Brain Project offers to computer science.
28/10/2013 - 16:30
The talk discusses recent developments within the machine learning community that are tailored to modern large-scale problems. (1) In supervised learning problems with many observations, online algorithms such as stochastic gradient descent, which pass over the data only once, are usually preferred over batch algorithms, which require multiple passes over the data. Recent work is presented showing that simple improvements lead to algorithms that are several orders of magnitude faster. (2) In unsupervised learning problems, learning appropriate representations (such as principal component analysis or K-means) can be generalized to take into account the natural structure of the problem at hand, leading to both better predictive performance and enhanced interpretability.
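To make the single-pass idea concrete, here is a minimal sketch of stochastic gradient descent for 1-D least squares on synthetic data (the data and step size are made up for illustration; this is not the improved algorithm from the talk):

```python
import random

# Minimal single-pass SGD for 1-D least squares (illustrative sketch).
# Model: y ~ w * x, with synthetic data generated from a known w_true.
random.seed(0)
w_true = 2.0
data = [(x, w_true * x + random.gauss(0, 0.01))
        for x in (random.uniform(-1, 1) for _ in range(5000))]

w = 0.0
step = 0.1
for x, y in data:                 # a single pass over the observations
    grad = (w * x - y) * x        # gradient of the loss 0.5 * (w*x - y)**2
    w -= step * grad              # one stochastic gradient step per sample

print(abs(w - w_true))            # small: w is close to w_true after one pass
```

A batch method would instead recompute the full gradient over all 5,000 points at every iteration; the contrast in per-update cost is the point of the comparison above.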
28/10/2013 - 13:30
Although feedback was used in ancient times, it is reasonable to say that the field of control appeared in the mid-1940s. Control was the first systems field. It represented a paradigm shift in engineering. A holistic but primitive view of control systems emerged in the 1950s, and the International Federation of Automatic Control (IFAC) was formed. Education in control spread rapidly to practically all engineering disciplines. It is, however, still difficult to fit control into a traditional structure divided into Mechanical, Electrical, and Chemical Engineering, and the field is still struggling with this issue. The second phase, starting in the 1960s, was characterized by very strong progress in control theory driven by the space race and the emergence of computer control. Theory developed dramatically, as did the industrial applications. A large number of sub-specialties appeared, but the holistic view of the field was unfortunately lost. In my opinion we are now entering a third phase, driven both by ubiquitous computing and networks and by a strong interest in feedback and control among our fellow scientists, particularly in Physics and Biology. What will happen depends largely on how control engineers and scientists respond to these challenges. The lecture will present some of the key ideas in the development of the field, give a glimpse of its rich industrial applications, and end with a few reflections on its future evolution. Important issues are interactions with other fields and education.
28/10/2013 - 10:30
Game Theory has interesting implications for the competing strategies of cells fighting for survival in a cancer patient. The talk starts with a very short tutorial on Game Theory, follows up with Robert's work using microfabricated ecologies that can accelerate evolution, and concludes with the prospects for a serious synthesis of Game Theory and cancer at a quantitative, testable level.
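As a toy illustration of the game-theoretic framing (not from the talk itself), a minimal sketch that finds the pure Nash equilibria of a 2x2 game, here a hawk-dove style interaction between two cell "strategies" with made-up payoff numbers:

```python
# Toy 2x2 game (hypothetical payoffs): pure Nash equilibria by brute force.
payoff_row = [[3, 1],   # row player's payoff for each (row, col) action pair
              [4, 0]]
payoff_col = [[3, 4],   # column player's payoff for each (row, col) pair
              [1, 0]]

def pure_nash(a, b):
    """Return (row, col) action pairs where neither player gains by deviating."""
    eqs = []
    for r in range(2):
        for c in range(2):
            row_best = all(a[r][c] >= a[r2][c] for r2 in range(2))
            col_best = all(b[r][c] >= b[r][c2] for c2 in range(2))
            if row_best and col_best:
                eqs.append((r, c))
    return eqs

print(pure_nash(payoff_row, payoff_col))  # two asymmetric pure equilibria
```

With these hawk-dove payoffs, the two pure equilibria are the asymmetric outcomes where one player is aggressive and the other yields, the kind of stable coexistence of strategies that makes the framework attractive for modeling competing cell populations.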