School of Engineering wins three prestigious ERC Consolidator Grants

Three ERC Consolidator Grants were awarded to researchers at the School of Engineering (STI). Their research projects focus on developing computers that are 1,000 times more energy efficient, regulating gene expression and revolutionizing data processing.

Thanks to their bold, groundbreaking projects, EPFL professors David Atienza, Volkan Cevher, and Sebastian Maerkl each won a prestigious European Research Council (ERC) Consolidator Grant.

The computer of the future, inspired by the human brain

With his Compusapien project, David Atienza, head of the Embedded Systems Laboratory (ESL), aims to create computers that are faster and 1,000 times more energy efficient than what’s currently on the market. The key concept is to revamp server architecture and use minuscule microfluidic channels to both cool the system and convert heat into electricity.

Atienza was inspired by the human brain. He initially came up with the idea of grouping the computing units by function, “like the specialized groups of neurons in our brains,” he said. The computer’s processors and memory, typically laid out in a 2D structure, are instead redesigned and stacked in a 3D integration architecture, resulting in faster processing and much lower latency when accessing the large volumes of data that Big Data applications require.

To prevent the system from overheating, Atienza inserts microfluidic channels 50 to 100 microns high between the layers. These channels serve two functions: they cool the system with the fluid that flows through them, and they convert the heat they absorb into electricity via reactions in minuscule fuel cells embedded in the channels. The electricity is then fed back into the server system, generating substantial energy savings.

Atienza has already demonstrated the feasibility of the microfluidic channel technology on a conventional IBM server, through work carried out with Dr. Bruno Michel’s group at IBM Zurich. They were able to generate 6 watts of electricity from a server that draws 30. “We could even triple that figure by playing with the server architecture, the shape of the channels and the properties of the microfluidic fuel cells used,” he said. It is an ambitious project with correspondingly ample rewards. “We also need to rethink the power consumption of the devices we currently use. Even if you could drive a Ferrari every day, most of the time a Twingo would suffice.”
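To put those figures in perspective, here is a back-of-the-envelope calculation in Python. The 30 W and 6 W values come from the article; the variable names and the tripling projection are purely illustrative.

```python
# Back-of-the-envelope check of the figures quoted above (illustrative only).
server_draw_w = 30.0   # power drawn by the demonstration IBM server
recovered_w = 6.0      # electricity generated by the in-channel fuel cells

recovery = recovered_w / server_draw_w
print(f"recovered fraction of input power: {recovery:.0%}")      # 20%
print(f"tripled, as Atienza projects:      {3 * recovery:.0%}")  # 60%
```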

ERC Project: Computing Server Architecture with Joint Power and Cooling Integration at the Nanoscale

A counterintuitive way to process data

Volkan Cevher, who heads the Laboratory for Information and Inference Systems, took on the challenge of efficiently analyzing ever-increasing amounts of digital data. This is a real problem, since computer processing power simply cannot keep up with the explosion in the volume of data to crunch.

Contrary to conventional wisdom, Cevher developed pioneering mathematical methods and learning algorithms that deliver the best trade-offs among the amount of data processed, processing speed, and power.

“People typically think that a large amount of data requires a long time to process,” Cevher said. “But we showed that this dogma is false by harnessing the data itself – rather than seeing it as the problem.” In other words, more data translates into faster data processing. “Take Sudoku as an example. The more numbers you have in the grid, the faster you can solve the puzzle. It’s pretty much the same idea, even if that seems counterintuitive,” Cevher said.
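For readers who want to make the Sudoku analogy concrete, here is a toy sketch; it assumes nothing about Cevher’s actual algorithms. It is a plain backtracking solver (the names `candidates` and `solve` are hypothetical), and the point is simply that a grid with more clues leaves fewer blanks and prunes the search, so it typically solves faster.

```python
# Toy illustration of the Sudoku analogy, not Cevher's method: with more
# clues ("data"), the backtracking search has fewer branches to explore.

def candidates(grid, r, c):
    """Digits that can legally go in cell (r, c) of a 9x9 grid."""
    row = set(grid[r])
    col = {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    box = {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return set(range(1, 10)) - row - col - box

def solve(grid):
    """Fill zeros in place; return True once a full solution is found."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for d in candidates(grid, r, c):
                    grid[r][c] = d
                    if solve(grid):
                        return True
                    grid[r][c] = 0
                return False  # dead end: backtrack
    return True  # no blanks left
```

Calling `solve(grid)` on a 9×9 list of lists (with 0 for blanks) fills it in place; timing it on grids with, say, 20 versus 30 clues illustrates the counterintuitive effect.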

Based on this idea, he showed that the best results come from striking the right balance among data processing time, data collection power, the bandwidth available at a given moment, and the amount of data required. By carefully balancing these variables, the approach can be tailored to any given situation, including calculating the minimum amount of data needed to get an accurate result.

There are numerous applications for Cevher’s work. Neural signal acquisition and medical imaging are good examples: if radiologists can calculate the minimum amount of data they need to generate an image, they can reduce patients’ exposure to radiation during CT scans. “Our algorithms can teach MRI machines to more quickly recognize the things doctors want to detect,” Cevher said.
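One standard way to formalize “the minimum amount of data” is compressed sensing: a sparse signal can be recovered from far fewer measurements than unknowns. The sketch below is a generic textbook example, not the lab’s code; the problem sizes, the random sensing matrix, and the choice of ISTA (iterative soft-thresholding) as the solver are all illustrative assumptions.

```python
import numpy as np

# Generic compressed-sensing sketch (not Cevher's actual code): recover a
# 5-sparse signal of length 200 from only 60 measurements by solving
# min_x 0.5*||Ax - y||^2 + lam*||x||_1 with ISTA.

rng = np.random.default_rng(0)
n, m, k = 200, 60, 5                     # unknowns, measurements, nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                                 # noiseless measurements

lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2         # 1/L with L = ||A||_2^2
x = np.zeros(n)
for _ in range(1000):
    g = x - step * (A.T @ (A @ x - y))                        # gradient step
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # shrinkage

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```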

Looking ahead, Cevher plans to apply his method to complex data processing problems in the realm of super-resolution microscopes, for example, but also in materials science, where he hopes to automate the virtual creation of new materials.

This is the second ERC grant that Volkan Cevher has received.

ERC Project: Time-Data Trade-Offs in Resource-Constrained Information and Inference Systems

Understanding, regulating, and reproducing gene expression

Sebastian Maerkl, head of the Laboratory of Biological Network Characterization, seeks to understand and reproduce the complex mechanism of gene expression. This fundamental process is still poorly understood but lies at the heart of how every component of our bodies was created.

Depending on the environment they’re in, human cells decide which gene should be expressed and with what intensity, so as to survive or grow. “The genomes in these cells are like a blueprint for a building. Based on their environment, they decide whether to build a cabin, a mobile home, or a mountain chalet,” Maerkl said.

Maerkl and his team will use a powerful high-speed microfluidic tool to study this process in yeast cells, which are similar to human cells. Their research comprises four steps.

1. Promoters
When a cell’s genes are expressed, some of the proteins that are made – called transcription factors – serve only to activate other genes by attaching to promoters. The first step of Maerkl’s research will be to measure the responses of different promoters and develop a model describing this mechanism (a toy model of this kind is sketched below, after these four steps).

2. Cell well-being
The research will also try to identify at what point a cell should express its genes in order to survive in different environments or to grow. “We want to know exactly what the right level of gene expression is for a cell to be in its optimal state. We will try to optimize the genes and measure the limits.”

3. Building new transcription factors
With these tools, Maerkl’s team will artificially create transcription factors and use them to regulate genes.

4. An artificial gene expression network
The final step will involve reverse-engineering the entire network responsible for gene expression – in other words, creating an exact replica of what occurs naturally in cells.

“We want to both improve our fundamental understanding of biological processes and increase the chances of being able to artificially create entire biological systems. The goal is to get exactly the same results as those produced by natural networks,” Maerkl said.
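As a concrete, if highly simplified, illustration of step 1, activation of a promoter by a transcription factor is often modeled with a Hill function. The sketch below is a generic textbook model, not Maerkl’s: the function name, parameter values, and units are all hypothetical.

```python
# Hypothetical toy model of step 1 (not Maerkl's actual model): expression
# driven by a single activating transcription factor, via a Hill function.

def promoter_output(tf, k_half=10.0, n=2.0, basal=0.05, vmax=1.0):
    """Relative expression level for transcription-factor concentration
    `tf` (arbitrary units). `k_half` is the half-activation point,
    `n` the cooperativity, and `basal`/`vmax` the leak and maximum rates."""
    activation = tf ** n / (k_half ** n + tf ** n)
    return basal + (vmax - basal) * activation

for tf in [0.0, 5.0, 10.0, 20.0, 100.0]:
    print(f"TF = {tf:6.1f}  ->  expression = {promoter_output(tf):.3f}")
```

Fitting curves of this kind to measured promoter responses is one standard way a quantitative model like the one described in step 1 could be built.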

ERC Project: Reverse Engineering Gene Regulatory Networks

Photos: Alban Kakulya

Profs. Atienza, Cevher and Maerkl