ChatGPT for Smart Cities

Crazy for ChatGPT? Some cities are going for it

It’s hard to predict whether ChatGPT will live up to the hype. The first-of-its-kind technology backed by Microsoft debuted in November 2022, followed a few months later by Google’s version, called Bard. Like any other potentially disruptive innovation, it was met with both enthusiasm and skepticism, but it quickly and indisputably became a hot topic and triggered conversations about Artificial Intelligence and how it impacts society and business.

What benefits might ChatGPT bring to cities and local governments? Administration and finance may be the first departments to take advantage of it. In Vietnam, the Ho Chi Minh City Department of Information and Communications is encouraging researchers and scientists to apply ChatGPT to the state management system to streamline workflows and make procedures more efficient, acknowledging it could also contribute to the design of new services to better serve people and businesses. In the US, a new ChatGPT-based tool for municipal budgeting has just emerged from beta testing, ready for cities that need help producing their budget books, complete with figures and text narratives about spending.

Some cities are going a step further and piloting ChatGPT applications. As reported by Cities Today, Singapore is using AI language models, such as those that underpin ChatGPT, to support civil servants in everyday tasks such as crafting policy papers, summarizing news, answering citizen queries, and managing long documents. The tool was developed by Open Government Products, Singapore’s in-house team that shapes technology to solve public sector challenges, and proved effective in speeding up research and writing while ensuring consistency and quality of the output.

The first utility worldwide to use ChatGPT is the Dubai Electricity and Water Authority, which announced a new application to improve customer support. It leverages ChatGPT’s ability to interact with users in natural language, providing quick and reliable answers to their enquiries. The technology is also being piloted to write programming code and solve coding problems.

Of course, ChatGPT isn’t without its critics, and its use by public bodies has drawn attention to possible risks related to privacy and data protection, cyber security, and other potential misuse. The opportunities for tech companies are clear enough – and the Big Tech rush may soon be joined by China’s giants: Beijing is supporting key firms to invest in an open-source framework to challenge ChatGPT and develop a rival platform.


Leverage AI to fight air pollution in India

Artificial Intelligence (AI) is having disruptive effects in many industries and areas of life, and it is increasingly used in Smart Cities to tackle urgent problems such as air pollution.

According to Greenpeace analysts, 22 of the world’s 30 most polluted cities are in India, where every winter more than 140 million people are exposed to severe air pollution. In the national capital Delhi, PM2.5 air pollution claimed approximately 54,000 lives in 2020.

Independent research stated that India would require a minimum of 4,000 monitoring stations to track air quality – but today there are roughly 160 active stations in the country, a wholly inadequate network for collecting the reliable, meaningful data needed to support any evaluation or decision-making process.

Here is where AI comes into play. Detailed air quality monitoring should combine different sources of information, including data generated by monitoring stations, weather and satellite data, but also human-related activities such as mobility and traffic, industrial sites, waste management, and garbage burning. AI can be leveraged to correlate all those factors and provide a better geospatial interpolation of air pollution data, supporting forecasts with a more precise understanding of pollution sources and trends.
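The specific models involved aren’t detailed here, but the baseline they improve upon is easy to sketch. A minimal, purely illustrative example of geospatial interpolation is inverse-distance weighting (IDW), which estimates PM2.5 at unmonitored locations from nearby stations; the coordinates and figures below are made up, and ML models would refine such estimates by folding in the traffic, weather, and land-use factors mentioned above.

```python
import numpy as np

def idw_interpolate(station_xy, station_pm25, query_xy, power=2.0):
    """Inverse-distance-weighted estimate of PM2.5 at query points.

    station_xy:   (n, 2) station coordinates (e.g. km on a local grid)
    station_pm25: (n,)   measured concentrations
    query_xy:     (m, 2) locations to estimate
    """
    # Pairwise distances between every query point and every station
    d = np.linalg.norm(query_xy[:, None, :] - station_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)      # avoid division by zero right at a station
    w = 1.0 / d ** power         # closer stations weigh more
    return (w * station_pm25).sum(axis=1) / w.sum(axis=1)
```

With two stations reading 100 and 50 µg/m³, a query point midway between them gets equal weights and an estimate of 75 – a smooth spatial blend that a learned model can then correct using the additional factors.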

Citizens can also offer a valuable contribution. In India, the UNDP Accelerator Lab, in collaboration with the University of Nottingham, developed a GeoAI digital platform that finds air pollution hotspots using satellite imagery and AI object detection algorithms. These algorithms were trained by a large group of volunteer citizen scientists across the world, with a specific goal: detecting brick kilns, which are hotspots of both vulnerable labour and air pollution. By applying AI algorithms to the citizen-science training data, more than 47,000 brick kilns across the Indo-Gangetic plains of India were detected and incorporated into the GeoAI open data platform, which uses an innovative mix of technologies to determine the exact locations of brick kilns from satellite imagery and assess their compliance with existing environmental policies and laws.

With active facilitation and training, volunteers succeeded in classifying more than 2,500 kilns within a week. Such intelligence, powered by both citizens and AI, is valuable for environmental regulators to initiate action against non-compliant kilns, facilitating targeted interventions against air pollution hotspots.

Air pollution in India, as in other countries around the world, is a serious health issue and decreases the quality of life. AI can support mitigation actions by providing location-specific air quality data and useful insights to authorities, industries, businesses, and citizens.

Estimating daytime with Machine Learning

Author: Paolo Veronesi, software engineer at Paradox Engineering


In an increasingly complex and interconnected world, Machine Learning is proving to be highly beneficial for Smart Cities and any IoT application where large volumes of high-dimensional data need to be processed, correlated, and acted upon.

At Paradox Engineering we are piloting neural networks in specific use cases such as municipal solid waste collection or patient monitoring in hospitals and clinics, both requiring medium to high computational power. Neural networks are generally applied where large computing resources (such as workstations or servers) are available, but we are now exploring how to use these technologies in the diametrically opposed scenario: extremely limited resources, such as a few hundred KB of RAM and standard 32-bit processors running below 100 MHz.

Leveraging neural networks on embedded IoT devices is a new, emerging application. Our goal is to process data locally, on the embedded device, without sending information over the network: this allows devices to work properly even in case of network failures, and improves the scalability and overall security of the system, as sensitive data are processed locally.

The use case we are focusing on is Smart Lighting. In a typical urban outdoor installation, streetlights are connected to a mesh network and managed by gateways operating as border routers, network coordinators, and data concentrators. A central management software (CMS) enables remote management and control of the network, as well as of single or grouped devices.

Streetlights are expected to run smoothly even when connectivity to the nearest gateway or the CMS is not available. To allow this, devices are configured to execute their routines based on the current date and time, which the gateway is assumed to provide. This is vital for streetlights to derive the ephemeris and calculate daily sunrise and sunset times – based on this data, lamps can switch on/off and dim according to programmed schedules.
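The ephemeris step itself is classical astronomy rather than Machine Learning. As a rough sketch – not our actual firmware implementation – the standard sunrise equation derives day length from the day of year and latitude alone:

```python
import math

def day_length_hours(day_of_year, latitude_deg):
    """Approximate day length via the standard sunrise equation.

    Solar declination comes from a simple cosine model; the sunset hour
    angle w0 satisfies cos(w0) = -tan(latitude) * tan(declination).
    Accurate to a few minutes at mid-latitudes, enough for schedules.
    """
    # Declination: roughly -23.44 deg around Jan 1, +23.44 deg in late June
    decl = math.radians(-23.44) * math.cos(
        math.radians(360.0 / 365.0 * (day_of_year + 10)))
    lat = math.radians(latitude_deg)
    cos_w0 = -math.tan(lat) * math.tan(decl)
    cos_w0 = max(-1.0, min(1.0, cos_w0))  # clamp: polar day / polar night
    w0 = math.degrees(math.acos(cos_w0))  # sunset hour angle in degrees
    return 2.0 * w0 / 15.0                # Earth rotates 15 degrees per hour
```

On the equator this yields twelve hours year-round; at 45° latitude it gives roughly 15.4 hours near the June solstice and 8.6 in late December. A real schedule would additionally use longitude and the equation of time to convert these into clock times for switching.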

What happens if a node is isolated and cannot connect to a gateway, for example because of adverse environmental conditions or network topology? Or if no gateway is installed at all? In such cases, the device needs another way to determine the current time. We decided to leverage data analysis and Machine Learning to let streetlights derive the time of day from data collected by their integrated environmental sensors.

Estimating time from parameters such as brightness, temperature, or pressure isn’t a simple task. Environmental conditions vary a lot from day to day and from season to season, with a level of complexity that would be impossible to handle with a traditional programming approach.

Neural networks can efficiently manage time series and, if trained with an adequate quantity and variety of data, can provide an accurate answer to our question. In our experiment, we trained the system using data generated by a set of environmental sensors over the course of one year.
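One detail any such model must handle – the output layer of the deployed network isn’t described here, so this is an assumption about a common approach, not the confirmed design – is that time of day is circular: 23:59 and 00:01 are two minutes apart, not nearly a full day. A standard trick is to regress a (sin, cos) pair instead of the raw minute count, and decode it with atan2:

```python
import math

MINUTES_PER_DAY = 24 * 60

def encode_time(minutes_since_midnight):
    """Map a time of day onto the unit circle, so the midnight wrap-around
    does not create an artificial discontinuity in the training target."""
    angle = 2.0 * math.pi * minutes_since_midnight / MINUTES_PER_DAY
    return math.sin(angle), math.cos(angle)

def decode_time(sin_val, cos_val):
    """Recover minutes since midnight from a (sin, cos) network output."""
    angle = math.atan2(sin_val, cos_val) % (2.0 * math.pi)
    return angle * MINUTES_PER_DAY / (2.0 * math.pi)
```

With this encoding, a regression loss on the two components penalizes a prediction of 00:05 for a true 23:55 as a small error rather than a catastrophic one, which is exactly the behavior a lighting schedule needs.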

Results were really promising: isolated streetlights were able to process data locally and estimate the current time of day with an average accuracy of about 16 minutes – and execute their schedules correctly. Without Machine Learning, this operational continuity would not have been possible.
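Measuring that accuracy fairly also requires wrapping around midnight – the exact metric behind the 16-minute figure isn’t stated, but for a circular quantity it would look like this hypothetical helper:

```python
def time_error_minutes(predicted, actual, minutes_per_day=1440):
    """Shortest circular distance between two times of day, in minutes:
    an estimate of 23:55 against a true time of 00:05 scores 10, not 1430."""
    diff = abs(predicted - actual) % minutes_per_day
    return min(diff, minutes_per_day - diff)
```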


Learn more about our Smart Lighting application: download our latest white paper (free registration required), and contact our Machine Learning experts!