LAAS Research and Technology Days

13/07/2017 - 11:00
Smart Cities are an example of Cyber-Physical Systems whose goals include improvements in transportation, energy distribution, emergency response, and infrastructure maintenance, to name a few. One of the key elements of a Smart City is the ability to monitor and dynamically allocate its resources. The availability of large amounts of data, ubiquitous wireless connectivity, and the critical need for scalability open the door for new control and optimization methods which are both data-driven and event-driven. The talk will present such an optimization framework and its properties. It will then describe several applications that arise in Smart Cities, some of which have been tested in the City of Boston: a “Smart Parking” system which dynamically assigns and reserves an optimal parking space for a user (driver); the “Street Bump” system which uses standard smartphone capabilities to collect roadway obstacle data and to identify and classify obstacles for efficient maintenance and repair; adaptive traffic light control; and optimal control of connected autonomous vehicles. Lastly, the talk will describe how a large traffic data set from the Massachusetts road network was analyzed to estimate the Price of Anarchy, comparing “selfish” user-centric behavior to “social” system-centric optimal traffic routing solutions.
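The Price of Anarchy mentioned above can be illustrated on Pigou's classic two-link routing example (a textbook sketch, not the Massachusetts data set from the talk): the ratio between the total travel cost of the selfish equilibrium and that of the socially optimal routing.

```python
# Pigou's two-link network: one unit of traffic, route costs c1(x) = 1
# (constant) and c2(x) = x (congestion-dependent). With a fraction x of
# traffic on link 2, the total cost is:
def total_cost(x):
    return (1 - x) * 1.0 + x * x  # flow times per-unit cost on each link

# Selfish (Wardrop) equilibrium: everyone takes link 2, since c2(x) <= 1
# for all x in [0, 1], so no driver can improve by switching.
selfish = total_cost(1.0)

# Social optimum: minimize the total cost over the split (grid search).
best = min(total_cost(i / 1000) for i in range(1001))  # minimum at x = 0.5

price_of_anarchy = selfish / best  # 1.0 / 0.75 = 4/3
```

The equilibrium cost is 1.0 while the optimum is 0.75, giving a Price of Anarchy of 4/3; estimating this ratio from real traffic data is exactly the kind of comparison the talk describes.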
03/06/2016 - 14:00
A microgrid is an electrical distribution network consisting of distributed generators, local loads, and energy storage systems that can operate in grid-connected or islanded mode. Different technologies are combined, such as power converters, control, communications, and optimization. In this way, energy can be generated and stored close to the points of consumption, improving stability and reducing the losses produced by long power lines. The talk will present multi-agent system technologies for distributed energy systems like microgrids, where distributed control is a powerful tool. Previous experience in the Danish electrical system, such as the Cell Controller project, used these technologies to balance dispersed energy generation and consumption. Other examples of real sites, including conventional islanded systems installed on islands and in remote rural areas, will be shown. Finally, low-voltage distribution systems and DC microgrids for residential applications and homes will be introduced, along with new worldwide projects to develop technologies for low-voltage DC distribution systems.
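A standard building block of decentralized microgrid control is frequency droop, where each inverter lowers its frequency as its power output grows so that units share load without communication. A minimal steady-state sketch, with illustrative values not taken from the talk:

```python
# Frequency droop: each inverter i follows f_i = f0 - m_i * P_i.
# In steady state all units settle at one common frequency, so the load is
# shared in inverse proportion to the droop slopes m_i.
f0 = 50.0            # nominal frequency (Hz)
m = [0.01, 0.02]     # droop slopes (Hz per kW) for two inverters
load = 30.0          # total active power demand (kW)

# Solve f0 - m[0]*P1 = f0 - m[1]*P2 together with P1 + P2 = load:
P1 = load * m[1] / (m[0] + m[1])   # the stiffer unit (smaller slope) takes more
P2 = load - P1
f = f0 - m[0] * P1                 # common steady-state frequency
```

Here the unit with half the droop slope supplies twice the power (20 kW vs 10 kW), and the grid settles slightly below nominal frequency, which a secondary control layer would then restore.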
25/03/2016 - 11:00
The Japanese space mission Hayabusa was the first sample-return mission from a near-Earth asteroid. Launched on 9 May 2003 from the Japanese Kagoshima Space Center, Hayabusa travelled for two years to reach the asteroid Itokawa. After some proximity studies, Hayabusa landed on the surface and, overcoming several technological problems, collected some dust from it. The sampled dust was returned to Earth in 2010. Professor Junichiro Kawaguchi, Hayabusa's Project Manager, will present some technological and scientific aspects of this mission. The Hayabusa mission was of high engineering importance for JAXA as a testbed for new technologies in the fields of ion engines, close-proximity motion around objects with low gravity, and autonomous optical navigation. From the mission analysis to the spacecraft rendezvous powered by recently developed ion thrusters, technological and computational challenges were overcome. In particular, combining low thrust and gravity assists to change orbits, as well as ballistic re-entry from an interplanetary orbit, were of great interest. Furthermore, the design of the touchdown and of the sample recovery posed difficult problems for which new solutions had to be invented. All these technological achievements served the scientific goals of the Hayabusa mission. Scientists' understanding of asteroids has been greatly improved by studying meteorite samples. However, matching meteorite samples collected on Earth with the exact asteroids from which they came remains an issue. Hayabusa helped overcome this limitation by bringing back pristine samples from a specific, well-characterized asteroid. In other words, Hayabusa bridged the gap between ground observation of asteroids and laboratory analysis of meteorite and cosmic dust collections.
25/09/2015 - 11:00
The technique of bioelectrical impedance analysis consists of characterizing cells and tissues by measuring their electrical properties at several frequencies. The electrical properties of tissues, described since 1872, can differ greatly depending not only on the tissue but also on its physiological status. In recent years, this technique has been refined into a powerful tool for label-free analysis and characterization of single cells. Here we will present the foundations of electrical analysis of cells and how it can be applied in a microfluidic flow cytometer capable of measuring the spectral impedance of individual cells and discriminating between cell types according to their dielectric properties. We will detail examples where bioelectrical impedance is used to measure infection in blood, or in toxicology and drug-resistance screening of cancer cell cultures. We will also show how bioelectrical impedance can be used to measure some characteristics of brain tissue.
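The frequency dependence underlying spectral impedance measurements is commonly summarized by the Cole model. A sketch with illustrative parameter values (not measurements from the talk):

```python
# Cole model for tissue/cell impedance:
#   Z(f) = R_inf + (R0 - R_inf) / (1 + (j*f/fc)**alpha)
# R0: low-frequency resistance, R_inf: high-frequency resistance,
# fc: characteristic frequency, alpha: dispersion broadening (0 < alpha <= 1).
def cole_impedance(f, R0=1000.0, R_inf=100.0, fc=1e5, alpha=0.9):
    return R_inf + (R0 - R_inf) / (1 + (1j * f / fc) ** alpha)

# At low frequency, current is blocked by cell membranes and flows around the
# cells (|Z| -> R0); at high frequency it penetrates them (|Z| -> R_inf).
low = abs(cole_impedance(10.0))   # close to R0
high = abs(cole_impedance(1e9))   # close to R_inf
```

Fitting such a curve to the measured spectrum of a single cell yields the dielectric parameters used to discriminate between cell types.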
26/03/2015 - 17:00
Flight-critical systems in aerospace vehicles must achieve a failure rate of less than 1 failure per 10^9 flight hours (i.e., fewer than 1 failure per 100,000 years of operation). Systems that achieve this level of reliability are hard to design, hard to verify, and hard to validate, especially if software is involved. In this talk, I will discuss some of the challenges the aerospace community faces in designing systems at this level of reliability and how it is beginning to approach them. We use results from computer science, robotics, and control to develop methods and tools for the design of control policies. Application areas include autonomous driving, vehicle management systems, and distributed multi-agent systems.
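The equivalence between the two figures in the requirement can be checked directly:

```python
# Sanity check: express 10^9 flight hours as years of continuous operation.
hours = 1e9
years = hours / (24 * 365)   # about 114,155 years
# So "fewer than 1 failure per 10^9 hours" is indeed a (slightly stronger)
# requirement than "fewer than 1 failure per 100,000 years of operation".
```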
12/11/2014 - 11:30
The diffusion and relaxation mechanisms of complex materials at the atomic scale remain largely misunderstood, both theoretically and experimentally. How, indeed, can we follow the motion of atoms over time spans relevant to understanding aging, fracture propagation, and self-assembly? Doing so requires bridging the gap between thermal vibrations, which occur on the picosecond scale (10^-12 s), and these phenomena, which unfold over times exceeding seconds, hours, and years. This thorny problem has preoccupied the scientific community for several decades, and many methods, long very limited, have been proposed over the years. These efforts, coupled with the steady improvement in supercomputer performance, have led to important advances in recent years that finally make it possible to bridge thermal vibrations and long-time structural evolution for generic materials. In this seminar, I will briefly present the problems and the evolution of the methods that today allow us to close this vast dynamical gap.
27/05/2014 - 17:00
Networking is about sharing resources, from sharing links with packet-switching to sharing servers in the cloud with virtual machines and parallel algorithms. Progress is driven mostly by clever inventions. In many cases, modeling and analysis lead to improved schemes. The time and space scales of networks necessitate distributed algorithms. Analyzing these algorithms often requires new methods for approximating the complex random dynamics and simplifying the objectives of the control schemes. The demand and available resources are determined by users and businesses. Thus, an economic layer interacts with the technology layer. These interactions are governed by relationships between demand, quality, and prices. These relationships may result from strategic behavior. As the technology evolves, new problems emerge that raise new challenges for researchers. In this talk, I review some of the problems that have fascinated me.
28/10/2013 - 18:30
Recently, the European Commission selected the Human Brain Project (HBP) as one of two Flagship projects, each with projected funding of 1 billion euros. The Human Brain Project builds on the foundations of the Blue Brain Project, which was founded in 2005 with the mission to integrate anatomical, electrophysiological, and genetic data into a unified model of the neocortical column. Over the course of 10 years, the HBP will create an integrated system of six ICT platforms, dedicated to Neuroinformatics, Brain Simulation, Medical Informatics, High Performance Computing, Neuromorphic Computing, and Neurorobotics. These platforms will lay the foundation for a new kind of ICT-based brain research. Three of the HBP platforms are dedicated to investigating novel, brain-derived technologies for future computers and computer science. The High-Performance Computing Platform will use brain-derived principles to design the next generation of supercomputers. The Neuromorphic Computing Platform investigates how brain-derived principles can be directly embodied in new computing hardware. Finally, the Neurorobotics Platform will allow researchers to use and explore brain models in simulated closed-loop robotics experiments. In this talk, I will give an overview of the Human Brain Project. I will then describe in more detail the preparatory research for the Neurorobotics Platform, on the basis of the pioneering work of the Blue Brain Project. Along the way, I will summarize the challenges and opportunities the Human Brain Project offers to computer science.
28/10/2013 - 16:30
The talk discusses recent developments within the machine learning community that are tailored to modern large-scale problems. (1) In supervised learning problems with many observations, online algorithms such as stochastic gradient descent, which pass over the data only once, are usually preferred over batch algorithms, which require multiple passes over the data. Recent work is presented, showing that simple improvements lead to algorithms that are several orders of magnitude faster. (2) In unsupervised learning problems, learning appropriate representations (such as principal component analysis or K-means) can be generalized to take into account the natural structures of the problem at hand, leading to both better predictive performance and enhanced interpretability.
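The single-pass online setting of point (1) can be sketched as follows, on synthetic one-dimensional least-squares data (a toy illustration, not the improved algorithms from the talk):

```python
# Single-pass stochastic gradient descent for least-squares regression:
# each observation is seen exactly once, with a decaying step size.
import random

random.seed(0)
w_true = 2.0
data = [(x, w_true * x + random.gauss(0, 0.1))
        for x in (random.uniform(-1, 1) for _ in range(5000))]

w, step = 0.0, 0.5
for t, (x, y) in enumerate(data, start=1):
    grad = 2 * (w * x - y) * x       # gradient of the loss (w*x - y)^2
    w -= (step / t ** 0.5) * grad    # 1/sqrt(t) step size, one pass only
```

After one pass, `w` is close to the true slope of 2.0; averaging the iterates, or the constant-step-size variants studied in recent work, further improves the convergence rate.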