The demands of modern software development mean that developers are being asked to write more code and deliver new functionality faster than ever before. The adage “don’t reinvent the wheel” has given many people the shortcuts they need to deliver code quickly and, in most cases, reliably.
Instead of writing something from scratch, a developer can often find something suitable in a software library or source code repository. These days, developers can also harness the power of microservices.
Microservices radically change the relationship between software developers and the code needed to achieve the desired functionality, as Shaun O’Meara, chief technology officer at Mirantis, explains: “In the past, you typically had one developer or a small team of developers developing each component of the system, each agreeing to work on different components.”
The team had to build everything from scratch, he says, but when software libraries became available, the developer could take advantage of the prebuilt functionality. The big change with microservices is that the mentality of software developers has changed – they can now consume work developed by other people and thus achieve huge productivity gains.
The impact, he says, is that code using microservices tends to consume more IT infrastructure than code developed in a more traditional way.
Increase in coding inefficiency
Inefficiency is now commonplace in software development. “Modern tools have made people lazy,” says Andy Powell, chief technology officer at Canterbury Christ Church University. “When I wrote web pages – it was before .Net appeared, and in classic ASP [active server pages] – you had to write everything yourself.”
When people used to visit websites over low-bandwidth dial-up modem connections, he says, “You had to be careful about image size, style sheets and page size; you had to be aware of how much data you were sending downstream, because load times were important.”
For Powell, from an application development perspective, this meant developers had to think about code efficiency. “You had to be really efficient with the database layer and your API [application programming interface] layer,” he says.
Queries to transactional systems were written to return the minimum viable set of data, whereas now, he says, “You get 100,000 records or tuples and you pick what you want from this [dataset] in memory, because memory has become so cheap.”
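As a minimal sketch of the difference Powell describes, the following Python snippet (using a hypothetical orders table in SQLite) contrasts pulling an entire dataset into memory with asking the database for only the rows and columns actually needed:

```python
import sqlite3

# Build a hypothetical table of 100,000 orders for illustration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, status TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [(i, f"cust{i % 100}", "open" if i % 10 else "closed", i * 1.5)
     for i in range(100_000)],
)

# Wasteful: fetch every row and every column, then filter in application memory
rows = conn.execute("SELECT * FROM orders").fetchall()
open_totals = [r[3] for r in rows if r[2] == "open"]

# Efficient: let the database return only the column and rows that are needed
open_totals = [r[0] for r in
               conn.execute("SELECT total FROM orders WHERE status = 'open'")]
```

The second query moves the filtering to the database, so far less data crosses the wire and sits in memory.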
Addressing code bloat
Today, developers treat bandwidth as an almost unlimited resource, while processing power, memory and storage are cheap and plentiful. This has led to code bloat, where developers no longer focus on writing software that runs as efficiently as possible and uses the smallest footprint of storage, memory and processing power.
Mav Turner, chief product and strategy officer at Tricentis, points out that code bloat typically stems from several sources, such as verbose syntax, redundant or unused features, and a lack of optimization during development. Additionally, he says, legacy codebases can accumulate technical debt over time, leading to bloated and convoluted implementations.
However, as Turner explains, “By adopting clean coding practices, modular design principles, and regular refactoring, developers can mitigate code bloat and maintain smaller, more manageable code bases.”
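As an illustration of the kind of refactoring Turner describes, this hypothetical Python example collapses two near-duplicate functions into one small, reusable one without changing behaviour:

```python
# Before refactoring: verbose, duplicated logic that only differs in the VAT rate
def total_price_gbp(items):
    total = 0
    for item in items:
        total = total + item["price"] * item["qty"]
    return round(total * 1.20, 2)  # 20% VAT

def total_price_eur(items):
    total = 0
    for item in items:
        total = total + item["price"] * item["qty"]
    return round(total * 1.19, 2)  # 19% VAT

# After refactoring: one small, parameterized function replaces both
def total_price(items, vat_rate):
    subtotal = sum(item["price"] * item["qty"] for item in items)
    return round(subtotal * (1 + vat_rate), 2)
```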
IT leaders must consider the factors and drivers that lead developers to write less efficient code. “No one intends for code to be bloated. Developers are not trying to destroy the environment,” says Maurice Kalinowski, product director at Qt.
However, as Kalinowski notes, there are numerous factors that cause unintentional inefficiencies. As an example, he says: “Very often, prototypes end up in production because time pressures force development teams into short delivery cycles. This leads to technical debt later in the development lifecycle, which in turn has the knock-on effect of harming efficiency.”
According to Kalinowski, it is important to consider the scope and use case of the code being developed. “If you’re developing ‘general purpose’ code, does the input need semantic validation? Does the code cover all possible scenarios, including those that may never happen?”
Even very efficient code can be exposed to additional use cases that require different checks and processing. This, Kalinowski warns, leads to more and more code being written to support these exceptions. As these accumulate, the code reaches a point where the performance benefits it was originally designed for may be lost.
Kalinowski says IT leaders should consider refactoring those parts of the code that are bloated. “Of course, first you need to know which parts are bloated in the first place. This is where a host of developer tools come into play, such as static and dynamic code analysis, profiling and sampling,” he adds.
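As a simple example of the dynamic analysis Kalinowski mentions, Python’s built-in cProfile module can show which functions dominate a program’s runtime and are therefore the best candidates for refactoring. The workload here is a deliberately wasteful stand-in:

```python
import cProfile
import pstats

def slow_path(n):
    # Deliberately wasteful: builds a throwaway list just to sum it
    return sum([i * i for i in range(n)])

def main():
    for _ in range(100):
        slow_path(10_000)

# Profile the run, then print the five functions with the highest cumulative time
cProfile.run("main()", "profile.out")
stats = pstats.Stats("profile.out")
stats.sort_stats("cumulative").print_stats(5)
```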
Inefficiencies in testing
Tricentis’ Turner encourages IT decision-makers to adopt test-driven development (TDD) as a development methodology. In his experience, TDD is a powerful technique that can contribute significantly to the creation of green code, characterized by higher quality and efficiency.
“By emphasizing the creation of tests before writing code, TDD ensures that developers have a clear understanding of the expected behavior and functionality of their code from the very beginning,” says Turner.
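A minimal sketch of the test-first workflow Turner describes, using pytest and a hypothetical vat module, might look like this: the tests are written first and fail, and only then is the smallest implementation that satisfies them added.

```python
# --- test_vat.py (hypothetical) -- written first, before vat.py exists ---
import pytest
from vat import add_vat

def test_add_vat_standard_rate():
    assert add_vat(100.0, rate=0.20) == 120.0

def test_add_vat_rejects_negative_price():
    with pytest.raises(ValueError):
        add_vat(-1.0, rate=0.20)

# --- vat.py -- the minimal implementation that makes the tests pass ---
def add_vat(price: float, rate: float) -> float:
    if price < 0:
        raise ValueError("price must be non-negative")
    return round(price * (1 + rate), 2)
```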
Looking at testing during application development, Ved Sen, head of innovation at TCS UK and Ireland, says IT leaders should also consider the impact of regression testing on the environment.
“When you do regression tests, you end up testing a lot of things over and over again, just to see if the program breaks,” he says. “But every time you do a regression test, you’re consuming more resources, and each one creates a small carbon footprint.”
According to Sen, it should be possible to build more intelligent ways of testing, so that developers don’t have to test the same use case over and over again.
Sen points out that if software developers avoid brute-force testing, they can reduce the footprint of test and development environments by a small but significant amount, which cumulatively has a much greater impact on making IT greener and less carbon-intensive.
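One possible sketch of the smarter testing Sen has in mind, assuming a hypothetical mapping of test files to the source modules they exercise, is to fingerprint those files and rerun only the tests whose inputs have actually changed:

```python
import hashlib
import json
import pathlib
import subprocess

CACHE = pathlib.Path(".test_hashes.json")

# Hypothetical mapping of test files to the source modules they exercise
TEST_DEPS = {
    "tests/test_billing.py": ["src/billing.py"],
    "tests/test_users.py": ["src/users.py"],
}

def digest(paths):
    # Fingerprint the combined contents of a set of files
    h = hashlib.sha256()
    for p in paths:
        h.update(pathlib.Path(p).read_bytes())
    return h.hexdigest()

def tests_to_run():
    old = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    new = {test: digest(deps + [test]) for test, deps in TEST_DEPS.items()}
    changed = [t for t, d in new.items() if old.get(t) != d]
    CACHE.write_text(json.dumps(new))
    return changed

if __name__ == "__main__":
    selected = tests_to_run()
    if selected:
        subprocess.run(["pytest", *selected], check=True)  # only affected tests
    else:
        print("No changes detected; skipping regression run")
```

Commercial test-impact-analysis tools do this far more rigorously, but even a crude version avoids rerunning unchanged use cases.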
Beyond coding, IT leaders can also consider addressing the overall environmental impact of the software development and testing environments their developers require.
Speaking at KubeCon + CloudNativeCon in Paris in March, Gualter Barbas Baptista, lead consultant for platform enablement and strategy at Deutsche Bahn, discussed the rail operator’s ongoing efforts to monitor and minimize the environmental impact of its cloud-based applications. Baptista talked about empowering developers, describing software developers as “really the ones making the decisions every day” in terms of what gets put into the software.
“If we don’t engage developers and give them the tools, we won’t be able to change the way we develop code and the way we manage infrastructure,” he says.
Over the past few years, Deutsche Bahn has focused on merging entire subsidiaries in order to implement standardization. This, he says, means “we can leverage the effects and ensure a higher level of standardization.”
Deutsche Bahn uses Kubernetes as a platform-building tool. Monitoring allows IT administrators to see CPU utilization and automatically adjust container workloads, optimizing them according to their needs.
Scheduling is also used at Deutsche Bahn to ensure that development and test environments can be put to sleep when developers are not working, saving processing power.
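A minimal sketch of this kind of scheduling (not Deutsche Bahn’s actual setup), using the official Kubernetes Python client and hypothetical deployment names, might scale a development environment to zero replicas outside working hours:

```python
from datetime import datetime
from kubernetes import client, config

WORKING_HOURS = range(8, 19)  # 08:00-18:59, local time

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    # Patch only the replica count of an existing deployment
    apps = client.AppsV1Api()
    apps.patch_namespaced_deployment_scale(
        name, namespace, {"spec": {"replicas": replicas}}
    )

if __name__ == "__main__":
    config.load_kube_config()  # or load_incluster_config() when run as a CronJob
    asleep = datetime.now().hour not in WORKING_HOURS
    # Scale the dev environment to zero overnight, back to one replica by day
    scale_deployment("dev-frontend", "dev", 0 if asleep else 1)
```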
Greening AI
The IT landscape is constantly evolving, which means IT sustainability is a moving target. In a panel discussion at the KubeCon + CloudNativeCon event, Chuck Dubuque, head of product marketing for OpenShift at Red Hat, warned that artificial intelligence (AI) is taking Kubernetes somewhere it hasn’t been before.
“When you add AI to an application, you increase its energy consumption by 10 times,” he said.
Looking at approaches to make AI greener, Oliver King-Smith, CEO of SmartR AI, says researchers are developing efficient methods for building and using AI, such as model reuse, ReLoRA, Mixture of Experts (MoE) models and quantization.
Discussing model reuse, King-Smith says the technique involves retraining an already trained model for a new purpose, saving time and energy compared to training from scratch. “This approach not only saves resources, but also often results in better performing models,” he says. “Both Meta and Mistral have been good at releasing reusable models.”
Looking at LoRA [low-rank adaptation] and ReLoRA, King-Smith says they are designed to reduce the number of calculations required when retraining models for new uses. This saves energy and enables the use of smaller, less power-hungry computers. “This means that instead of relying on large, power-intensive systems like Nvidia’s DGX, a modest graphics card can often be sufficient for retraining,” he says.
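As an illustration of the approach (not SmartR AI’s own implementation), Hugging Face’s peft library makes this style of low-rank retraining straightforward: the base model’s weights stay frozen and only small adapter matrices are trained.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # small model for illustration

lora_config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    target_modules=["c_attn"],  # the attention projection in GPT-2
    lora_dropout=0.05,
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the base model
```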
MoE models, such as those recently released by Mistral, activate fewer parameters per request than conventional models. This, King-Smith says, results in fewer calculations and lower power consumption. “The MoE models activate only the necessary blocks when in use, similar to turning off lights in unused rooms, leading to a 65% reduction in energy consumption.”
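The “lights off in unused rooms” behaviour comes from a gating network that routes each token to a subset of experts, so the rest never execute. The following toy PyTorch layer, written purely for illustration, shows top-1 routing:

```python
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    """A toy mixture-of-experts layer: only the top-1 expert runs per token."""
    def __init__(self, dim: int, num_experts: int):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_experts))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scores = self.gate(x)         # (tokens, num_experts)
        best = scores.argmax(dim=-1)  # pick one expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = best == i
            if mask.any():            # unselected experts never execute
                out[mask] = expert(x[mask])
        return out

moe = TinyMoE(dim=16, num_experts=4)
tokens = torch.randn(8, 16)
print(moe(tokens).shape)  # torch.Size([8, 16])
```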
King-Smith describes quantization as a technique that reduces the size of an AI model. “By quantizing the model, the number of bits needed to represent each parameter is reduced. This reduces the size of the model, allowing the use of less powerful and more energy-efficient hardware,” he says.
Although quantization can have a small effect on model accuracy, King-Smith argues that for many practical applications this trade-off is not noticeable.
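As a rough illustration of quantization in practice (not tied to any particular vendor’s tooling), PyTorch’s post-training dynamic quantization stores the weights of linear layers as 8-bit integers, shrinking the model roughly fourfold:

```python
import os
import torch
import torch.nn as nn

# A small model standing in for a much larger network
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

# Post-training dynamic quantization: Linear weights stored as 8-bit integers
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def size_mb(m: nn.Module) -> float:
    # Compare on-disk footprint by serializing the weights
    torch.save(m.state_dict(), "tmp.pt")
    return os.path.getsize("tmp.pt") / 1e6

print(f"fp32: {size_mb(model):.2f} MB, int8: {size_mb(quantized):.2f} MB")
```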
Addressing code bloat and unnecessary levels of regression testing helps make coding greener, and there are also options to use more efficient microservices or algorithms. But the general consensus among industry experts is that it is very difficult to change ways of working that software developers have become used to.
Mirantis’ O’Meara sees an opportunity to address green IT in software development from an IT infrastructure perspective. “If we can remove the complexity and offer only the components of the IT infrastructure that are needed, then we are able to create a thin layer over the entire IT infrastructure,” he says.
Kubernetes can also be used to ensure that software development and test environments do not use IT resources unnecessarily.
Such techniques enable the IT infrastructure to be lean and energy-efficient. A similar approach, as Qt’s Kalinowski points out, can be used in coding to reduce the number of scenarios and exceptions that the code being developed must handle.