News Articles & Media Mentions
ATLANTA — As cities have begun to collect and release unprecedented amounts of data, questions about citizen privacy have become increasingly relevant. Local governments, for their part, often lack specific privacy policies and rely on checks such as community outcry, industry best practices and guidance from law professors to dictate the limits of their work. This was an overarching topic at many panels during the recent MetroLab Annual Summit.
Perception is also important, as Chicago and its collaborators learned upon launching the Array of Things project, an influential smart cities initiative made up of thousands of sensor nodes. Array of Things, which is being expanded to other cities, grew out of a collaboration between the city and researchers at the University of Chicago, including Charlie Catlett, director of the Urban Center for Computation and Data.
Catlett said that when the team first set up the nodes that collect data for the Array of Things project, the community was skeptical of any government effort to collect information, so the technologists had to learn to be very deliberate in explaining what they were doing and why.
Speaking at SC17 in Denver this week, a panel of smart city practitioners shared the strategies, techniques and technologies they use to better understand their cities and improve the lives of their residents. With data coming in from all over the urban landscape and processed by machine learning algorithms, cities can now pair municipal problems with research expertise. Debra Lam, managing director for smart cities and inclusive innovation at Georgia Tech, who works on strategies for Atlanta and the surrounding area, said, “we’ve embedded research and development into city operations; we’ve formed a matchmaking exercise between the needs of the city and the most advanced research techniques.”
Panel moderator Charlie Catlett, director of the Urban Center for Computation and Data at Argonne National Laboratory, who works on smart city strategies for Chicago, said that the scale of data involved in complex, long-term modeling will require nothing less than the most powerful supercomputers, including the next generation of exascale systems under development within the Department of Energy. The vision for exascale, he said, is to build “a framework for different computation models to be coupled together in multiple scales to look at long-range forecasting for cities.”