Environment Variables Year Three Roundup
It’s been three years of Environment Variables! What a landmark year for the Green Software Foundation. From launching behind-the-scenes Backstage episodes, to covering the explosive impact of AI on software emissions, to broadening our audience through beginner-friendly conversations, this retrospective showcases our mission to create a trusted ecosystem for sustainable software. Here’s to many more years of EV!

Learn more about our people:
Chris Adams: LinkedIn | GitHub | Website
Anne Currie: LinkedIn
Chris Skipper: LinkedIn
Pindy Bhullar: LinkedIn
Liya Mathew: LinkedIn
Asim Hussain: LinkedIn
Holly Cummins: LinkedIn
Charles Tripp: LinkedIn
Dawn Nafus: LinkedIn
Max Schulze: LinkedIn
Killian Daly: LinkedIn
James Martin: LinkedIn

Find out more about the GSF:
The Green Software Foundation Website
Sign up to the Green Software Foundation Newsletter

Resources:
Backstage: TOSS Project (02:26)
Backstage: Green Software Patterns (04:51)
The Week in Green Software: Obscuring AI’s Real Carbon Output (07:41)
The Week in Green Software: Sustainable AI Progress (09:51)
AI Energy Measurement for Beginners (12:57)
The Economics of AI (15:22)
How to Tell When Energy Is Green with Killian Daly (17:47)
How to Explain Software to Normal People with James Martin (20:29)

If you enjoyed this episode then please either:
Follow, rate, and review on Apple Podcasts
Follow and rate on Spotify
Watch our videos on The Green Software Foundation YouTube Channel!
Connect with us on Twitter, Github and LinkedIn!

TRANSCRIPT BELOW:

Chris Skipper: Welcome to Environment Variables from the Green Software Foundation. The podcast that brings you the latest in sustainable software development has now been running for three years. So that's three years of the latest news in green software, talking about everything from AI energy through to the cloud, and its effect on our environment and how we as a software community can make things better for everybody else. This past year Environment Variables has truly embodied the mission of the Green Software Foundation, and that's to create a trusted ecosystem of people, standards, tools, and best practices for creating and building green software. Now this episode's gonna feature some of the key episodes that we did over the last year. We're gonna be looking at a wide variety of topics, and it's going to be hopefully a nice journey back through both the timeline of the podcast, but also the landscape of green software over the last year and how it has dramatically changed, not only due to the dramatic rise in use of AI amongst other things, but also due to the fantastic ideas that people have brought to the table in order to try and solve the problem of trying to decarbonize software. So without further ado, let's dive in to the first topic.

Chris Skipper: First, we brought about a new change in the way the podcast was structured: a new type of episode called Backstage. Backstage is basically a behind-the-scenes look at the Green Software Foundation's internal projects and working groups. It's a space for our community to hear directly from project leaders, to share the wins and their lessons learned, and reinforce trust and transparency, which is one of the core tenets of the Green Software Foundation Manifesto. Now, there were a bunch of great projects that were featured over the last year. We're gonna look at two specifically. In our first Backstage episode, we introduced the TOSS project. TOSS stands for Transforming Organizations for Sustainable Software, and it's led by the fantastic Pindy Bhullar.
This project aims to embed sustainability into business strategy and operations through a four-pillar framework. It's a perfect example of how the foundation operationalizes its mission to minimize emissions by supporting organizations on their sustainability journey. Let's hear the snippet from Pindy explaining these four pillars.

Pindy Bhullar: Transforming Organizations for Sustainable Software is the acronym for TOSS. Businesses will be able to utilize the TOSS framework as a guide to lay the groundwork for managing change and also improving software operations in the future. Software practices within organizations can be integrated with sustainability in a cohesive and agile manner, rather than addressing green software practices in an isolated approach. For a company to fully benefit from sustainable transformation of their software development processes, we need to review all aspects of technology. The TOSS framework is designed to be embedded across multiple aspects of its business operations. Dividing the TOSS framework along four pillars has allowed for simultaneous top-down and bottom-up reinforcement of sustainable practices, as well as the integration of new tools, processes, and regulations that emerge over time. The four pillars aim to foster a dynamic foundation for companies to understand where to act now, to adjust later and expand within an organization's sustainable software transformation. The four pillars are strategy, implementation, operational, and compliance and regulations, and within each of the pillars we have designed a decision tree that will be constructed to guide organizations in transforming their software journey.

Chris Skipper: Some fantastic insights from Pindy there, and I'm sure you can agree the TOSS project has applicability outside of just software development. It's one of those projects that's really gonna grow exponentially in the next few years. Next up, we have Green Software Patterns. The Green Software Patterns project is an open source initiative designed to help software practitioners reduce emissions by applying vendor-neutral best practices. Guests Franziska Warncke and Liya Mathew, project leads for the initiative, discussed how organizations like Aviva and MasterCard have successfully integrated these patterns to enhance software sustainability. They also explored the rigorous review process for new patterns, upcoming advancements such as persona-based approaches, and how developers and researchers can contribute to the project. That's one thing to remember about Backstage: it actually highlights that there are so many projects going on at the GSF. We actually need more people to get involved. So if you are interested in getting involved, please visit greensoftware.foundation to find out more. Let's hear now from Liya Mathew about the Green Software Patterns project.

Liya Mathew: One of the core and most useful features of patterns is the ability to correlate with the Software Carbon Intensity specification. Think of it as a bridge that connects learning and measurement. When we look through the existing catalog of patterns, one essential thing that stands out is their adaptability. Many of these patterns not only align with sustainability, but also coincide with security and reliability best practices. The beauty of this approach is that we don't need to completely rewrite a software architecture to make it more sustainable.
Small actions like caching static data or providing a dark mode can make a significant difference. These are simple, yet effective steps that can lead us a long way towards sustainability. Also, we are nearing the graduation of Patterns V1. This milestone marks a significant achievement, and we are already looking ahead to the next exciting phase, Patterns V2. In Patterns V2, we are focusing on persona-based and behavioral patterns, which will bring even more tailored and impactful solutions to our community. These new patterns will help address specific needs and behaviors, making our tools even more adaptable and effective.

Chris Skipper: Moving on, we also kept our regular episode format, The Week in Green Software, also known affectionately as TWiGS. TWiGS was originally hosted by Chris Adams and is now occasionally hosted by the fabulous Anne Currie as well. It offers quick, actionable updates in the green software space. With a rising tide of sustainability and AI developments, this format helps listeners stay current. I can tell you now that in the last year, the number of news topics has just exploded when it comes to anything to do with AI and the impact it's having on the environment. And I think part of that is due to the work of the GSF and its community members. We used to have to really struggle to find news topics when this podcast first started back in 2022. But now in 2025, every week, I would say nearly every hour, there's a new topic coming out about how software is affecting the environment. So The Week in Green Software is your one-stop place for finding all that information distilled down into one place. And also, you can sign up to the GSF newsletter as well via the link below, which will give you a rundown of all the week's latest news topics as well. So let's look at a couple of episodes of TWiGS from the previous year. The first one is an episode with the executive director of the GSF, Asim Hussain. Asim really embodies the mission of the GSF in so many ways and is always passionate about the effect that software is having on the environment. In this episode, which was subtitled Obscuring AI's Real Carbon Output, Asim joined Chris to unpack the complexities of AI's carbon emissions, renewable energy credits, and regulatory developments. This episode emphasized the need for better carbon accounting practices; work the foundation is helping to advance. Let's hear this little snippet from Asim now.

Asim Hussain: You can plant a tree, right? And then you planted the tree. That tree will grow, and there's issues there, the tree will grow and it'll suck carbon from the atmosphere. And you can say that's a carbon credit, that planting of a tree. Or there's carbon avoidance offsets, and there's many variants, and there are actually very good variants of carbon avoidance offsets. But there is a variant of a carbon avoidance offset where I've got a tree and you pay me not to cut it down. And so where is the additionality? If I'm actually planting a tree, it's happening; in planting a tree I'm adding additional kind of capacity in carbon removal. And then the renewable energy markets are exactly the same. You can have renewable energy which, if you buy it, means a renewable power plant is gonna get built, and you can have renewable energy which is just kind of sold, and if you buy it or you don't buy it, there's no change. Nothing's gonna happen. There's no new renewable plant gonna get built.
Only one of them has that additionality component. And so therefore, only one of them should really be used in any kind of renewable energy claims. But both of them are allowed in terms of renewable energy claims.

Chris Skipper: One of the things I love about the way Asim talks about software in general is that he always uses idioms, like that planting of a tree, to explain a really complex topic and make it more palatable for a wider audience, which is something that we're gonna explore later on in this episode as well. But before we do that, let's move on to another episode of The Week in Green Software, which was subtitled Sustainable AI Progress. I think you can see a theme that's been going on here. This was our hundredth episode, which was a massive milestone in its own right, and the fantastic Anne Currie hosted Holly Cummins to explore LightSwitchOps, zombie servers, and sustainable cloud architecture. This conversation perfectly aligns with the foundation's mission to minimize emissions through smarter, more efficient systems, and having the really knowledgeable, brilliant Holly Cummins on to talk about LightSwitchOps was just fantastic. Let's listen to this next clip from her talking about LightSwitchOps.

Holly Cummins: We have a great deal of confidence that it's reliable to turn a light off and on, and that it's low friction to do it. And so we need to get to that point with our computer systems. And you can sort of, uh, roll with the analogy a bit more as well, which is, in our houses it tends to be quite a manual thing of turning the lights off and on. You know, I turn the light on when I need it. In institutional buildings, it's usually not a manual process to turn the lights off and on. Instead, what we end up with is some kind of automation. So like often there's a motion sensor. So, you know, I used to have it that, um, if I would stay in our office late at night, at some point, if you sat too still because you were coding and deep in thought, the lights around you would go off and then you'd have to like wave your arms to make the lights go back on. And it's that, you know, it's this sort of idea of, like, we can detect the traffic, we can detect the activity and not waste the energy. And again, we can do exactly this with our computer systems, so we can have it so that it's really easy to turn them off and on. And then we can go one step further and we can automate it, and we can say, let's script to turn things off at 5:00 PM because we're only in one geo

Chris Skipper: So as you can see, there's always been this theme of the rise in AI, you know, and I think everybody who's involved in this community, and even people outside of it, are really kind of frightened and scared of the impact that AI is having on the environment. But one thing that the GSF brings is this anchoring, this hope that there is actually change for the better. And there are people who are actively working against that impact within the software industry. And there's actually gonna be a lot of change coming in the next year, which will make things a lot more hopeful for the carbon output of the software industry. So between 2024 and 2025, AI's impact on the environment became one of the most discussed topics in our industry, and obviously on this podcast. In 2023 alone, data center electricity consumption for AI workloads was estimated to grow by more than 20%.
With foundation models like ChatGPT-4 using hundreds of megawatt-hours per training run, obviously there are a lot of statistics out there that are quite frightening, but hopefully Environment Variables brings you some peace of mind. And with that, we wanted to expand our audience to a wider group of people that weren't just software developers, to make things more palatable for your everyday computer user, for example. So one of those episodes that we're gonna feature around that move to try and increase our audience growth is an episode called AI Energy Measurement for Beginners, where Charles Tripp and Dawn Nafus helped us break down how AI's energy use is measured and why it's often misunderstood. Their beginner-friendly approach supports one of the GSF's key goals, which is making green practices more accessible and inclusive. Here is Charles talking about one of those points in this next snippet.

Charles Tripp: I think there's like a historical bias towards number of operations, because in old computers without much caching or anything like this, right? Like, uh, I restore old computers and, um, like an old 386 or IBM XT, right? Like it's running, it has registers in the CPU and then it has main memory, and almost everything is, basically, how many operations I'm doing is going to closely correlate with how fast the thing runs and probably how much energy it uses, because most of the energy consumption on those systems is just basically constant no matter what I'm doing. Right. It doesn't idle down the processor while it's not working. Right. There's a historical bias that's built up over time that was focused on that, and it's also at the programmer level, like, I'm thinking about what is the computer doing? What do I have control over? But it's only through actually measuring it that you gain a clearer picture of what is actually using energy. Um, and I think if you get that picture, then you'll gain, um, an understanding more of how can I make this software, or the data center, or anything in between, like job allocation, more energy efficient. But it's only through actually measuring that we can get that clear picture. Because if we guess, especially using kind of our biases from how we learn to use computers, how we learn about how computers work, we're actually very likely to get an incorrect understanding, an incorrect picture of what's driving the energy consumption. It's much less intuitive than people think.

Chris Skipper: Thanks to Charles for breaking it down in really simple terms and for his contribution to the podcast. Another episode that tried to simplify the world of AI and the impact that it's having on the environment is called The Economics of AI, which we did with Max Schulze. He joined us to talk about the economics of cloud infrastructure and AI. He challenged the idea that AI must be resource intensive, arguing instead for clearer data, stronger public policy, and greater transparency, all values that the GSF holds dear. Let's listen to that clip of Max talking about those principles.

Max Schulze: I think when, as a developer, you hear transparency and, okay, they have to report data, what you're thinking is, oh, they're gonna have an API where I can pull this information.
Also, let's say, from the inside of the data center. Now in Germany, and it is also funny for everybody listening, one way to fulfill that, because the law was not specific: data centers now are hanging a piece of paper, I'm not kidding, on their fence with this information, right? So this is like them reporting this. And of course, we as, I'm also a software engineer, so we as technical people, what we need is the data center to have an API that basically assigns the environmental impact of the entire data center to something. And that something has always bothered me, that we say, oh, it's the server, or it's the, I don't know, the rack or the cluster. But really, what does software consume? Software consumes basically three things. We call it compute, network, and storage, but in more philosophical terms, it's the ability to store, process and transfer data. And that is the resource that software consumes. A software does not consume a data center or a server. It consumes these three things. Mm-hmm. And a server makes those things, turns actually energy and a lot of raw materials into digital resources. Then the data center in turn provides the shell in which the server can do that function, right? It's the factory building, it's the data center. The machine that makes the T-shirts is the server, and the T-shirt is what people wear.

Chris Skipper: Again, it's those analogies that make things easier for people to understand the world of software and the impact it's having on the environment. Also, with that idea of reaching a broader audience, we try to also talk about the energy grid as well as software development, as those two things are intrinsically linked. So one of the episodes that we wanna feature now is called How to Tell When Energy Is Green with Killian Daly. Killian explained how EnergyTag is creating a standard for time- and location-based energy tracking, two topics that we've covered a lot on this podcast. This work enables companies to make verifiable clean energy claims, helping build trust across industries. Let's listen to this clip from Killian.

Killian Daly: Interestingly, uh, actually on the 14th of January, just before, uh, um, the inauguration of Donald Trump, uh, as US president, the Biden administration issued an executive order, which hasn't yet been rescinded, um, basically on, uh, data centers on federal lands. And in that they do require these three pillars. Uh, so they do have a three-pillar requirement on, uh, on electricity sourcing, which is very interesting, right? I think that's quite a good template. Uh, and I think, you know, we definitely need to think about, like, okay, if you're gonna start building loads of data centers in Ireland, for example. Ireland, uh, 20%, 25% of electricity consumption in Ireland is from data centers. That's way more than anywhere else in the world in relative terms. Yeah, there's a big conversation at the moment in Ireland about, like, okay, well, how do we make sure this is clean? How do we think about, um, uh, procurement requirements for building a new data center? That's a piece of legislation that's being written at the moment. And how do we also require these data centers to do reporting of their emissions once they're operational? So the Irish government, uh, is also putting together a reporting framework for data centers. And the energy agency, the
Sustainable Energy Authority of Ireland, SEAI, they published a report a couple of weeks ago saying, yeah, they do, you know what, they need to do this hourly reporting based on, uh, contracts bought in Ireland. So I think we're seeing already promising signs of legislation coming down the road in, um, you know, in other sectors outside of hydrogen. And I think data centers is probably an obvious one.

Chris Skipper: Fantastic clip there from Killian. It also highlights how the work that the GSF is doing is having an impact on the political landscape as well, in terms of public policy and the discussions that are being had at the higher levels of government. Moving on, we wanna talk about the final episode that we wanna highlight from the last year, and that's the episode How to Explain Software to Normal People with James Martin. We ended the year with this episode with James, who talked about strategies for communicating digital sustainability to non-technical audiences, which is something that we try to do here at Environment Variables too. From frugal AI to policy advocacy, this episode reinforced the power of inclusive storytelling. Let's listen to this clip from James Martin.

James Martin: A few years ago, the French Environment Minister said people should stop, uh, trying to send so many, uh, funny email attachments, you know? Oh, really? Like, like when you send a jokey video to all your colleagues, you should stop doing that because it's not good for the planet. Honestly, that the, uh, minister could say something that misguided... because that's not, we, you and I know that's not where the impact is. Um, the impact is in the cloud. The impact is in, uh, hardware. So instead of that, the communication is repetition, and I always start with: digital is 4% of global emissions. 1% of that is data centers, 3% of that is hardware, and software is sort of, they're sort of all over the place. So that's the figure I use the most to get things started. And I think the number one misconception that people need to get their heads around is that people tend to think that tech is, uh, immaterial. It's because of expressions like the cloud. It just sounds like this floaty thing rather than a massive industry. Ethereal. We need to make it more physical. If, uh, I can't remember who said that, but if data centers could fly, then it would make our job a lot easier. Um, but no, that's why you need to always come back to the figures. 4% is double, uh, the emissions of planes. And yet, um, the airline industry gets tens or hundreds of times more hassle than the tech industry in terms of, uh, trying to keep control of their emissions. So what you need is a lot more, uh, tangible examples, and you need people to explain this impact over time. So you need to move away from bad examples like funny email attachments, or the thing, um, we keep hearing in AI, which is, uh, one ChatGPT prompt is 10 times more energy than a Google search.
That may or may not be true, but again, it's the wrong example, because it doesn't focus on the bigger picture. And it kind of implies, yeah, it can make people think, if I just reduce my usage of this, then I'm gonna have like 10 times the impact. You know, that feels a bit kind of individual, a bit like individualizing the problem. Surely it does, and it's putting the onus on the users, whereas, once again, it's not their fault. You need to see the bigger picture. And this is what I've been repeating since I wrote that, uh, that white paper, actually: you can't say you have a green IT approach if you're only focusing on data centers, hardware or software. You've got to focus on all three, otherwise... Yeah, exactly. Holistically.

Chris Skipper: With that, we've come to the end of this episode. Well, what a year it's been for Environment Variables, and we'll just take a look at some of the statistics, just to blow our own horn here a little bit. We've reached over 350,000 plays. Engagement and followers of the podcast have gone up by 30%, which indicates to us that Environment Variables really matters to the people that listen to it, and it's raising awareness of the need to decarbonize the software industry. Looking ahead, we remain committed to the foundation's vision of changing the culture of software development, so sustainability is as fundamental as security or performance. Year four will bring new stories, new tools, new opportunities, new people hopefully, and all in an effort to reduce emissions together. So thank you for being part of our mission, and here's to another year of action, advocacy and green software innovation. And now to play us out is the new and improved Environment Variables podcast theme.

Hey everybody, thanks for listening. This is Chris, the producer again, just reaching out to say thank you for being a part of this community and a bigger part of the GSF as a whole. If you wanna listen to more episodes of Environment Variables, please head to podcast.greensoftware.foundation to listen to more, or click the link below to discover more about the Green Software Foundation and how to be part of the podcast as well. And if you're listening to this on a platform, please click follow or subscribe to hear more episodes of Environment Variables. We'll catch you on the next one. Bye for now.
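As a practical footnote to Holly Cummins's LightSwitchOps clip above: scripting systems off at the end of the working day can be as small as one scheduled job. Here is a minimal sketch, assuming AWS EC2 with boto3 and an instance tag named lightswitchops; the tag name, region, and schedule are all assumptions to adapt to your own platform.

```python
import boto3

# LightSwitchOps sketch: stop any running EC2 instance tagged for
# out-of-hours shutdown. Run it from a scheduler (cron, EventBridge)
# at the end of the working day, with a matching start script each morning.
ec2 = boto3.client("ec2", region_name="eu-west-1")

reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:lightswitchops", "Values": ["true"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

instance_ids = [
    instance["InstanceId"]
    for reservation in reservations
    for instance in reservation["Instances"]
]

if instance_ids:
    ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Stopped {len(instance_ids)} instances for the night")
```

The point of the analogy is the low friction: once turning things off is this cheap and this reliable, zombie servers stop accumulating by default.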
Open Source Carbon Footprints
Chris Adams is joined by Thibaud Colas, product lead at Torchbox, president of the Django Software Foundation, and lead on Wagtail CMS. They explore the role of open source projects in tackling digital carbon emissions and discuss Wagtail's pioneering carbon footprint reporting, sustainable default settings, and grid-aware website features, all enabled through initiatives like Google Summer of Code. Thibaud shares how transparency, contributor motivation, and clear governance can drive impactful sustainability efforts in web development, and why measuring and reducing emissions in the Python ecosystem matters now more than ever.

Learn more about our people:
Chris Adams: LinkedIn | GitHub | Website
Thibaud Colas: LinkedIn | Website

Find out more about the GSF:
The Green Software Foundation Website
Sign up to the Green Software Foundation Newsletter

Resources:
Wagtail CMS [01:46]
Web Almanac | HTTP Archive [08:03]
Google Summer of Code [11:07]
Wagtail RFCs [19:51]
A Gift from Hugging Face on Earth Day: ChatUI-Energy [27:55]
PyCon US [36:07]
Grid-aware websites - Green Web Foundation [39:22]
Climate Action Tech [41:07]
Agent Bedlam: A Future of Endless AI Energy Consumption?
Here's how re/insurers can curb GenAI emissions | Reinsurance Business

If you enjoyed this episode then please either:
Follow, rate, and review on Apple Podcasts
Follow and rate on Spotify
Watch our videos on The Green Software Foundation YouTube Channel!
Connect with us on Twitter, Github and LinkedIn!

TRANSCRIPT BELOW:

Thibaud Colas: If you get your contributors to work on high value and high impact things, that's the best way to motivate them. So that's kind of the idea here is, formalize that we have a goal to reduce our footprint. And by virtue of this, we, you know, make it a more impactful thing for people to work on by having those numbers, by communicating, for this specific change to images, here is the potential for it to reduce emissions.

Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software. I'm your host, Chris Adams.

Hello and welcome to Environment Variables, where we bring you the latest news and updates from the world of sustainable software development. I'm your host, Chris Adams. If you want the way we build software to be more sustainable and more inclusive, one way to improve the chances of this happening is to make it easier to build it that way, so building greener software goes with the grain of the software framework you're using. And one way to do that is to update the defaults so that they prioritize accessibility and sustainability in the framework itself. One of the people I've seen who really exemplifies this idea and this approach is my guest today, Thibaud Colas, a lead developer at the software agency Torchbox, the current president of the Django Software Foundation and the product lead at the popular Wagtail Content Management System, which is also built on top of Django.
The Wagtail CMS powers sites like the NASA Jet Propulsion Labs website, the University of Pennsylvania's website, the Tate Gallery, and even the main NHS website in the UK. So while it might not have the same coverage as WordPress, which powers more than a third of the internet, it still powers a large number of, a number of large sites, and changes made in this framework can have a decent reach. So changes made here are worth discussing, because the Wagtail CMS docs, in my view, are probably the most advanced in talking about sustainability for any open source CMS right now. And there's a clear link between sustainability and the embodied emissions of the hardware that people need to use to access your websites too. And with that in mind, you can see it's got some of the most developed accessibility features as well. But we're getting ahead of ourselves, and Thibaud is in a much better place to talk about this than me. So Thibaud, thank you so much for joining us. Can I give you the floor to introduce yourself for our listeners?

Thibaud Colas: Hi. It's my pleasure, Chris. Thank you for having me. I'm Thibaud, my pronouns are he/him. And, yeah, I'm the product lead for the Wagtail CMS at Torchbox. Wagtail is an open source project and product, and Torchbox, we are the original creators of the project and main contributors. And, yeah, as product lead I help shape the work of Torchbox on Wagtail and of other contributors as well. And, as president of the Django Software Foundation, I have similar responsibilities for the Django project. Django being a big Web framework, one of the biggest in Python. Just to give you a sense of scale, Wagtail, that's on the order of 10 to 20,000 sites out there. And Django, we're talking half a million to a million projects.

Chris Adams: Cool. Thank you, Thibaud. And, Thibaud, where are you calling me from today?

Thibaud Colas: I'm in Cambridge, UK. I got started on Wagtail way back in New Zealand, but travels took me back to Europe and the UK. I'm from France originally.

Chris Adams: Oh, cool. Alright, thank you for that. So I'm Chris Adams. I am the co-chair of the Green Software Foundation Policy Working Group. We also have show notes for this, so all the projects and links that we discuss will be available in your quest to basically develop better sustainable software engineering skills. So look up podcast.greensoftware.foundation to find that. Alright, Thibaud, we've got a bunch of questions to get through. Shall we start?

Thibaud Colas: Yeah, sure.

Chris Adams: Okay. Alright. So one thing that really came up on my radar a few years ago when I saw this was that Wagtail was one of the few, one of the only projects I've seen so far that actually tried to put together a kind of carbon footprint inventory of all of the websites that it's responsible for. And I remember the posts, and we'll share a link to this, explaining some of this and some figures for this. Like, "we reckon that Wagtail was kind of responsible for around more than 8,000 tons of CO2 per year from all the sites that we run." Could you maybe talk a little bit about, basically, the approach you took for that and why you even did that? 'Cause there's probably a few discussions about decisions you had to make and trade-offs you had to choose between model uses and coming up with numbers and all that. But maybe we go from the beginning about why you started that.
Then we can dive into some of the details.

Thibaud Colas: Yeah. Yeah. Well, simply enough, you know, when you start to think about the impact of technology on what we build, as developers at least, we love to try and quantify that impact, you know, put some figures on there. And the carbon footprint of websites, well, when you think of the sites, there are lots of components. There's things that happen in the browser, things that happen server side. And when I say server side, these days, you know, the infrastructure is quite varied and somewhat opaque as well. So yeah, server side. So when it comes to Wagtail, with it being an open source project, it's quite interoperable with all sorts of databases and file storage and web browsers, obviously. So it becomes quite tricky to actually put a number on the emissions of one site. And I guess that's where we started at Torchbox, specifically trying to quantify the emissions of our clients for 50 to a hundred websites. And from there, you know, you realize that it makes lots of sense to try and do it for the whole Wagtail ecosystem, so that you can make, hopefully, decisions for the whole ecosystem based on sites out there. So yeah. I think it was back in 2023 that we did this first, and there were definitely lots of ways back then to quantify sites' emissions. We didn't necessarily reinvent any, but we tried to understand, okay, when we have little knowledge of those thousands of websites out there, which methodologies should we be referring to when we try and put those figures together? So I say specifically methodologies because I think that's one of the potential pitfalls for developers starting in there. They assume that, somewhat like performance, you can have quite finite, reproducible numbers, but we're just not there yet with the carbon footprint of websites. So I think it's really important that you combine, so in our case, you combine the Web Sustainability Guidelines' related methodologies, so it's called the Sustainable Web Design model, and that you also combine things that look more closely at the servers, you know, CPU and resource use, and also other aspects in the browser.

Chris Adams: Okay, cool. And one thing that I actually quite appreciated when you did this, or when the team you are part of did this, was that, yeah, you shared all these numbers, but you also shared the underlying spreadsheets and the workings, so that other folks who might be running projects themselves can use it as either a starting point or even possibly challenge and propose maybe improvements as we learn more about this. Because we know that it's a difficult field to kind of navigate right now, but it is getting a bit easier, and as we learn more things, we are able to kind of incorporate that into the way we kind of model and think about some of the interventions we might make to maybe reduce the environmental footprint or improve it, basically?

Thibaud Colas: Yeah. Yeah, it's a, you might actually be aware of a project, Chris, the HTTP Archive's Web Almanac. They review the whole of the Web, on the order of 20 million websites, every year, and they produce numbers based on this data set of websites. So that's kind of, I suppose, what I tried to follow with this methodology as well, of sharing our results to the fullest extent so that other people can verify the numbers and potentially also put the same numbers together for their own sites, individual sites, or also a site ecosystem. So, you know, Wagtail, it's a CMS among many other CMSs. There's lots of competitors in that
There's lots of competitors in thatspace and nothing would make me happier than seeing other CMSs do the same and hopefully reusewhat we've spent time putting together. And yeah, obviously when we do this once for, Wagtail, we can try and do it also for Django.So there's also these benefits of, across the whole tech stack, having that kind of methodology more nailed down for people who make those decisions. You know, like product level decisions.Chris Adams: Oh, okay, cool. And just like we have release cycles for presumably new websites or like new CMSs and everything like that, as we learn more, we might be able to improve the precision and the accuracy of some of this to refine the assumptions, right. And, you know, many eyes make bugs shallow. So Drupal folks, if you're hearing this, or WordPress folks, yeah.Over to you basically.Thibaud Colas: Exactly. And you know, definitely the methodologies evolve over time. So one of the recent ones I really like is how, with Firefox browser, you can measure the CPU usage to render a single page.And just that is becoming so much more accessible these days that we could potentially do it on every release of the CMS.Chris Adams: Cool. Well, let's come back to that because this is one thing that I found quite interesting about the, some of the work that you folks have been doing is not only were you starting to measure this, but you're looking at actually options you can take to maybe set new defaults or improve some of this stuff.And, as I understand it, Wagtail, you've had some luck actually finding some funding and finding ways to basically cover the cost of people to essentially work on this stuff via things like the Google Summer of Code and things like that. Maybe you could talk a little bit about some of that, because as I understand it, you're in year three of successfully saying, "Hey, this is important.Can someone fund it?" And then actually getting some funding to pay people to work on this stuff.Thibaud Colas: Yeah. Yeah. Well, yeah. So, once you have those numbers in place as to, you know, how much emissions the sites out there produce, try and refine it down to a few specific aspects of the sites that, you know, you go throughthe quick wins, you figure out what you have the most leverage over, and then you realize there's this and that concepts that are potentially quite fundable if youknow just how to frame it and who to talk to. And we, as Torchbox, we have quite a few clients that care about the footprint of their websites,but it's definitely also a good avenue. The Google Summer of Code program you mentioned, it's about getting new people excited with open source as new contributors in the open source space.It's entirely funded by Google. And essentially Google, they trust projects like Wagtail and Django to come up with those ideas that are, you know, impactful, and also sensible avenues for people to get up to speed with open source. And so, yeah, we, it's been three years now that we've done this with a sustainability focus where we try every year to have an idea in this space.And I think it's quite interesting as an option because, few people that come to open source, you know, early in theircareer are aware of sustainability. It's quite a good, opportunity for them to learn something very new, even if they might already know the technology like Django and Wagtail. And for us, it allows us to work on those concepts that, you know, we saw the data, we saw the promise. 
Thibaud Colas: So I think, the first year we did this, 2023, we looked at image optimization. It's actually quite a big thing for a CMS, in my opinion at least, that, you know, people wanna add lovely visuals to all of their pages, and, you know, maybe sometimes there is a case for fewer images if you want to lower the footprint. But it's definitely also a case where you have to have images, you want them to be as low footprint as possible. So for that specific project, we were joined by two contributors who helped us. One worked on the AVIF support in the CMS, AVIF being one of the newer image formats that promises to have a lower file size than the alternatives. And the other one helped us essentially make the APIs we have in Wagtail to generate multiple images, make that more ergonomic. So you'd be able to generate, say, three different variations of an image and then only send to the user the one that fits best for how the image is displayed, so that hopefully it's smaller. So it's this responsive images concept.

Chris Adams: Oh, I see. So you basically, it may be that the server needs to generate some of these images 'cause you don't have control over who's accessing your website. But when someone's accessing something with maybe a small, like a touch device or something, rather than send this massive thing over the wire, you can send something which is appropriately smaller. So it might take up less space inside the memory and the DOM, and less over the wire as well, right.

Thibaud Colas: Exactly. You were talking about the grain of Wagtail. Wagtail has very few opinions as far as how you create the pages, but we definitely try and leverage the grain of HTML, so this responsive images pattern is quite well put together in HTML and Web standards, and, yeah, really happy with the results. Honestly, I think for the specific trial sites we rolled it out on, it was on the order of 30% lower page weight. And for the Wagtail web at large, like, every year we see the improvements in those audits about how much usage there is of modern image formats, how much usage there is of responsive images. We see the figures improve. So, really cool.

Chris Adams: Cool. We should actually share links to some of these things as well, actually. 'Cause one of the wonderful things about working with an open source project is you can say, well, if you want this to be a norm, then here is the PR you could copy, basically, right.

Thibaud Colas: Yeah. And something like AVIF support, I'm sure we'll talk about it at some point. Definitely, you know, we couldn't create the AVIF decoders and so on ourselves, so we've been relying on the Python ecosystem at large. And yeah, now those things are in a place where lots of projects have those decoders available.

Chris Adams: Cool. Are there any others? So that was year one and this is year three, and I think I can probably share with you that, so, we're a nonprofit organization, we publish a library called CO2.js. We've managed to get some funding from Google once, for the Google Summer of Docs, not Google Summer of Code, where they actually funded us to make some of this library a bit easier for other people to use. And we found that quite helpful, because that's been one of the ways people come to this for the first time: they use a library called CO2.js. And that wasn't something we could prioritize. So it's kind of nice. It just, it would be nice if there were more organizations funding this kind of work rather than just like one Web giant.
Like it's nice that Google is doing this, but if you too work in a large tech company and you wanna actually fund this stuff or make it easier for your engineers to do this, then, yeah, it's right there, folks. Okay. So maybe I can ask about some of the other years that you have running, like, is there anything else that you'd like to talk about or draw people's attention to for some of the other ones?

Thibaud Colas: Google Summer of Code is a three-month program, but lots of those things, to be honest, they keep chugging along in the background for quite a while and making improvements. So, year two of this, we worked on the starter project for Wagtail. So a starter website where, just like as you mentioned earlier, the defaults, trying to make sure that it's easier for people to get a site up and running that has all of the right things in place to be low impact. So that time, a contributor, Aman Pandey, helped us with the designs as well as the coding of these templates. And, just from the get-go, the idea was let's measure the designs even before they touch a Web browser. Let's make sure that we understand all of the, you know, newer standards, like the Web Sustainability Guidelines, that those designs have that baked in, so that when you generate the sites, you are guaranteed better results. So this project template's still in progress, but the designs at least are super promising. And year three, starting as of this week, just to be clear, is grid awareness. So grid awareness is a big term. Essentially it means looking at ways that, as the website loads in your browser, it'll be optimized based on the carbon intensity of your computer and your local grid electricity. So what that means is, if it would produce lots of emissions for the site to load in your browser, we try and make the website optimized for the emissions to lower. And yeah, so our contributor for this, Rohit, he's been around the Wagtail community for a bit and has this interest in sustainability. And again, I think a great example of something that will tangibly help us reduce the impact of Wagtail websites out there and also make more developers aware of those patterns and, you know, the underlying need for those patterns.

Chris Adams: I am glad you mentioned the names, actually, 'cause in the initial year I was working with Aman Pandey, and I think one of your colleagues might be working with Paarth. So, hi Paarth. Hi Aman. I hope you're listening. It's really nice to actually see this, 'cause these were people who are, like you said, early career, didn't get that much exposure, but honestly, compared to the industry, they're relative experts now. And that might say more about the state of the industry right now, but this was something I actually found quite nice, working with someone who was relatively young, who was actually really keen and honestly worryingly productive. Did make me worry a bit about my own job going forward. But yeah, this was one thing that was really cool from that, actually.

Thibaud Colas: Paarth and Aman are two of the mentors working with me on this Wagtail websites project this year. So this is also the other goal of this Google Summer of Code program: retaining those people in the open source world. And, yeah, definitely, you know, we are at a point now where we have more and more of those people coming to open source with that realization. There's way more room for this to happen on other projects like Wagtail, but, baby steps.
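For readers who want to see what the image work described above looks like from Python, here is a rough sketch using Wagtail's renditions API. The filter-spec strings and AVIF output depend on which Wagtail, Willow, and Pillow versions you have installed, so treat the specifics as assumptions rather than a recipe from the project.

```python
# Sketch: pre-generate a few renditions of an image so templates can serve
# the smallest file that fits the layout, encoded as AVIF where supported.
# Run inside a configured Wagtail project, e.g. via `python manage.py shell`.
from wagtail.images import get_image_model

Image = get_image_model()
image = Image.objects.first()  # any image from the library, purely for illustration

# One rendition per breakpoint; "format-avif" assumes AVIF output is
# available in your installed image libraries.
filter_specs = [
    "width-400|format-avif",
    "width-800|format-avif",
    "width-1600|format-avif",
]

for spec in filter_specs:
    rendition = image.get_rendition(spec)
    print(spec, rendition.url, f"{rendition.width}x{rendition.height}")
```

In templates, the same idea is normally expressed through Wagtail's image template tags, so the browser can pick the smallest candidate from a srcset instead of always downloading the largest file.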
Chris Adams: Oh wow. I didn't realize that you actually had, there was a kind of program to, I guess, invest in and provide some of that leadership, so people who prioritize this are able to have a bit more influence inside that project, for example.

Thibaud Colas: Yeah, exactly. Well, you know, open source, we have very different incentives compared to the corporate and, yeah, for-profit world. So we don't necessarily have super clear ways to retain people, but definitely, people who are interested and have the drive, we try and retain them by having them move from first-time contributors to repeat contributors, maintainers, mentors and so on.

Chris Adams: Oh, cool. All right, so that is a nice segue to allow us to talk a little bit about, I guess, taking ownership of carbon emissions and the strategies that you have there. Because one thing we should add into this list is that there's actually a roadmap for Wagtail specifically. I think it's RFC 90, or is there a particular term for, like, a request for comments or something that you folks use to talk about governance and talk about what you prioritize in this?

Thibaud Colas: It's a bit of a running joke. In Python they have the PEP proposals, Python Enhancement Proposals, and in Django they have the DEPs. People have been wondering if Wagtail should have the WEPs, but right now we just have RFCs, requests for comments. And yeah, it's just a super, like, simple way for us to invite feedback. It's really, rather than, you know, creating those governance or technical architecture decision documents in private chats, put them in public, and then invite feedback from others. So, you know, we've had this RFC for a couple years now, I believe. I got some good support from one of the experts out there on open source governance, Lauri Apple. She coached me through, you know, trying to build up community momentum and also trying to find ways to make this reusable again beyond Wagtail. And yeah, so this RFC, like, if you're deep in this space, it's nothing super special. It's about building awareness, finding opportunities for fundraising, working on the right concepts. But I think it's quite unique for open source projects to have that kind of clear direction for those things. Open source projects don't even necessarily have a roadmap of any kind to start with, and one on specific topics like this, I think it's really important. I think there's something Lauri says often, which is, if you get your contributors to work on high value and high impact things, that's the best way to motivate them. So that's kind of the idea here: formalize that we have a goal to reduce our footprint. And by virtue of this, we, you know, make it a more impactful thing for people to work on, by having those numbers, by communicating, for this specific change to images, here is the potential for it to reduce emissions.

Chris Adams: Oh, I see. Okay. So I've just followed the link to the RFC that you have here, and there's actually quite a lot of stuff here. So I can see a link to the free Green Software for Practitioners course for people who don't know that much about it. I can see that Wagtail itself has a sustainability statement, so, these are our priorities. So there's some immediate kind of explicit statement that this is something you care about. And then, as I understand it, there's some references to other things. So there's the prior work with the GSoC, Google Summer of Code.
There's references to the W3C Web Sustainability Guidelines and a bunch of stuff like that. And there's a few others. We'll share a link to this, because I think it's actually a really good example for other people to be aware of or see, like, this is what a relatively large, mature project does, and this is what it looks like when they start prioritizing this. Because there are some organizations that are doing this quite well. I know that there is a .NET-based CMS, I've totally forgotten the name, Umbraco CMS, who also have some quite strong, who are also quite advanced in this. And they're another good example of this. But when you talk about, okay, prioritizing this and responsibility, there's a whole question about, okay, well, whose job is it or who's responsible for this? Because you are building a piece of software and, like, you might not get that much control over who adopts the software, for example. Like, I think when you shared this breakdown, we saw, I think you mentioned there was one Vietnamese website, Vinmec, that was making up like a third of the reported emissions.

Thibaud Colas: Put me in touch.

Chris Adams: Yeah.

Thibaud Colas: Yes. So this is a very, with the caveat that carbon accounting isn't my expertise, you know, in the corporate world, we have the very clear greenhouse gas GHG Protocol and scope one, scope two, scope three standards. And in that corporate world, I think there's this, I think, scope three, category 11, use of

Chris Adams: Use of products.

Thibaud Colas: The use of, it's worse than that. It's use of sold products.

Chris Adams: That's it. Yeah. Sold. Yeah.

Thibaud Colas: So if you're not a corporation, and we're not a corporation, Wagtail, we have about 20 contributors on the core team, and if you don't sell your product, which standards are you meant to be using, then, to decide essentially which emissions we should be reporting on? So this figure of the carbon footprint of Wagtail, on the order of five to 10 thousand tons a year, that's assuming, you know, we take some ownership for this usage of Wagtail and of the websites built with it. And it's actually, I think, quite tricky to navigate in the open source world, understanding which standards of reporting are helpful, because, you know, in some respects, people who shop for a website builder or CMS or any tech really kind of expect specific standards to be met. You know, you mentioned having a sustainability statement. No one's expecting that just yet in the open source world, but we definitely want things to move that way. And we have to, you know, make sure that when we create those figures, they are somewhat comparable to other projects. So, yeah, I guess for Wagtail, you know, there's the fact that you don't control who uses it and you don't control how they use it, either. So, if someone wants to, you know, make a site that's pretty big and it's pretty popular in some country, maybe

Chris Adams: Yeah.

Thibaud Colas: adult entertainment websites that don't have any...

Chris Adams: Does PornHub go on WordPress' ledger, right? Is it on their accounts? Yeah.

Thibaud Colas: Exactly. We have a few like this in the Wagtail and Django world, and, you know, the technology, it's open source licensed. We have no interest in taking any kind of control or having a more contractual relationship with those projects, but we still need to navigate how to account for their use, essentially.
What actually got me started on this, Chris, I think it's worth mentioning, is the work of Mozilla and the Mozilla Foundation. They were the first ones I saw, I think back in 2020, reporting the use of the Firefox browser as part of the emissions of Mozilla. And it was, I think it was 98% of the emissions of Firefox were, like, sorry, the emissions of Mozilla came from Firefox. And it just got me thinking, you know, for Wagtail and Django, obviously it's a similar type of scale.

Chris Adams: Also with Firefox, the browser, like, you don't necessarily pay Firefox to use it, but you may be paying via the fact that your atten, you know, you kind of pay in your attention. And the fact that when you click on a search, an ad in Google, one of the search services, Firefox is being paid that way. So you're not actually making a direct monetary, like, you're not giving them money directly, but there is payment taking place and changing hands. And this is one thing that is actually quite difficult to figure out. Like, okay, how do you represent that stuff? Because, like you said, it's not sold per se, or you're not paying in money, but you may be paying in something else, for example. And it's almost like, I mean, I guess it's a good thing that you do see some of these protocols for reporting being updated, because they're not necessarily a good fit for how lots of new business models on the internet actually work, for example.

Thibaud Colas: Yeah. And it's really important for us to get in this space as open source technologists, I believe. Because, I mentioned procurement, definitely the expectations are rising in Europe, in particular in the EU, on the carbon impact of technology. And I think it's quite a good opportunity for open source. You know, we have very high transparency standards, for us to meet those requirements, not necessarily to lower the emissions dramatically, but at least be transparent on the impact of the software.

Chris Adams: Yeah, I mean, this is actually, you touched on quite an interesting thing, which is both a link to some of the Mozilla work, but also, in the kind of AI world, which is kind of adjacent to us as, like, webby people. I know that Mozilla provided a bit of funding to Code Carbon, which is an open source Python library for people to understand the environmental footprint of AI training, and I think these days some inference as well, via the kind of AI Energy Score, a project that they have with Hugging Face, for example. So, you know, one of the reasons you have that is because, oh God, I'm gonna murder the name. There's a French supercomputer, Jean Paul. Jean. Oh, do you know the one I'm talking about?

Thibaud Colas: No, I don't actually.

Chris Adams: Okay. So maybe the thing I'll do is I'll give you a chance to respond to this while I look it up, but I do know that one of the reasons we have any numbers at all for the environmental footprint of AI was because there was a, you know, publicly funded, supported supercomputer, with some work by some people at Hugging Face, I forget, the Bloom model.

Thibaud Colas: Oh, the Bloom

Chris Adams: Yeah. Yeah, exactly. That, we have these numbers, and there was a paper produced to actually highlight this, because the people who are running the supercomputer are able to share some of these numbers, where traditionally we've had a real challenge getting some of these numbers back.
So that's one place where having some open examples at least gives us something like a proxy in the face of, like, not quite deafening silence from groups like OpenAI and Anthropic and stuff like that, but we're not seeing that much in the way of numbers. And given that we're seeing this massive build out, it's definitely something we are, I'm very glad. It's useful to have like open source organizations, open source projects, and some other ways of funding this to actually at least create some data for us to have a kind of data informed discussion about some of this. Thibaud Colas: A hundred percent. Yeah. This Bloom large language model is, I think really, it's really essential for us to see this research being done, because then when, you know, people talk about adding AI in a CMS or in their Django projects, we can point them to understanding, like, you know, what the potential increase in the carbon footprint of the project is, and yeah. You know, in the AI world, there's this whole debate about what open source means for AI models. Definitely it's not, there's lots of gray areas there, but if you wanna reuse their research, it's much easier if there's just an underlying philosophy of open source and open data in those organizations. Chris Adams: Jean Zay, that's the name of the supercomputer in France, which has this, there's actually ones in Boston as well. There is one over there. And the, in the US, the NREL, the National Renewable Energy Lab folks, they did, they've shared a bunch of information about this as well when they've got access to this, and this is actually providing a bit of kind of light to a discussion, which is mostly about heat so far, it seems. So that's actually quite kind of useful. You've just made me realize that later on this year, this might be one of the angles that we might see people talking about the use of AI for actually drilling for oil and gas and other kind of stuff which is not great for climate. Because Eni, which is a nationally, it's a state owned, Eni is a state owned energy company in Italy. They are one of the few people who actually have a publicly owned supercomputer. And because Italy is one of the countries that signed the Paris Agreement, there's currently a whole law court case about essentially suing Eni to say, well, if you are state owned, and this is, and you've signed this, why the hell are you actually now using AI to drill for oil and gas, for example? And this might be one of the ways that we actually see some numbers coming out of this. 'Cause since 2019, we know that there are companies which are doing things with this. But for example, we know that say companies like Microsoft are involved in helping use these tools to kind of get oil and gas and fossil fuels out of the ground. But there's not much visible, there's not much out there right now since the press releases stopped in 2019, and it feels like it's a real gap we have when we talk about sustainability and technology, and particularly AI, I suppose. Thibaud Colas: Yeah, that's really interesting for us to consider for Wagtail as well because, you know, we talk about the carbon footprint of the websites, but it's also important to consider what the website might be enabling, you know, in positives and negatives.
And, yeah, even beyond websites, when I've tried to, you know, take my work from Wagtail to Django and even the Python ecosystem at large, with Python, you have to reckon with the footprint of web services built with Python, but also of all the data science that supports this same, you know, oil and gas industry. So it's tricky. Chris Adams: Yeah, I mean, we've just seen, like, we're speaking on the 4th of June, and two days ago we saw, like, a massive drone attack wiping out like a third of Russia's bomber fleet. Right. And that was basically, like, some Arduino drone pilot software was one of the key pieces that was used inside that. And it's not necessarily like the open source developers, they didn't build it for that. But we are now seeing all this stuff show up and, like, we haven't figured out ways to kind of talk about where the responsibility lies or how you even think about some of this stuff. Because, yeah, this takes the, like, we might have words like dual use for talking about these kinds of technologies, but in a world of open source, it becomes much, much harder to figure out where some of these boundaries lie and how you actually, well, I guess, set some norms. I mean, maybe this is actually one thing. Yeah, I'll leave you some space, then I wanna ask you a little bit about, you mentioned the wider Python ecosystem, 'cause I know that's something you've actually been having some conversations with as well. Thibaud Colas: Yeah, well, connecting the dots, you know, it's also the usage, but also as contributors, you have to consider that maybe there are only so many people in the Wagtail or Django world that are responsible for how the tech is put together. So maybe in some sense you do share some kind of responsibility personally for the tech you produced out there, even though you don't control how people will use it. Which is, you know, a whole dimension of how you, or how much, you take ownership of that. And yeah, in the Python world more widely, you know, Python is the most popular language out there. Even if it might not be the most performant, even if there might be simpler languages that help you get more optimized, lower emissions software, people are gonna use Python in all sorts of ways. And some of them you agree with, some you don't. I think that's one of the, you know, realities of open source contribution you have to be aware of. Chris Adams: You've actually said something quite interesting about, okay, yeah, there's a limited number of people inside the Wagtail community and you've been able to have some success in, like, helping set some norms or helping kind of set some directions there. And there's maybe a slightly larger group, which is like in the Django land, like with you kind of acting as the president now, I know there's some interest that you have there. And there's groups that I've been involved with, right. But you also mentioned that there's a kind of wider Python ecosystem there, people talking about this. I mean, is this, if someone is actually looking, let's say they're coding in Python, they wanna find out who else is doing this. Like, is there someone you'd look, you'd point people to? Or are there any conversations you're aware of going on right now? Thibaud Colas: The Python ecosystem is big. So one of the big challenges to get started with is just putting enough people together to have those discussions. I have tried on the Python discussion forum. I think it's, "who's working on sustainability in Python?"
is the thread I put together. And I guess, I think, to me, what's important at this point is just getting tech people, you know, aware of the fact that we have this climate change challenge and that they can do something about it. And then, you know, realizing that open source has a role to play and as open source contributors we can very much move the needle. So in the Python world, you know, it being so big and the uses being so different, there are lots of ways to help by working on the performance of Python itself, but there are also lots of ways to help outside that. Even something as simple, you know, as the Python Software Foundation trying to quantify their own organization's footprint or the footprint of a conference like PyCon US, that can go quite a long way, I think. Chris Adams: Cool. Thank you for that. Actually, I'm really glad you mentioned PyCon US because there were a number of talks that I heard other people on other podcasts talking about, that they were really pleased to see. So there, there seemed to be some latent interest there. And what we could do is we'll share a link to some of the videos that were up there, because yeah, I was pleasantly surprised to see them when I saw PyCon's videos come up on YouTube because, wow, they came up really fast. Like there is, you know, really nice things about like design pressure and how to think about, like, your code. But yeah, there's a few people who are totally new to, like, you know, the existing green software field, there are people who seem to be quite new to it talking about this. So that's, that's encouraging. Thibaud Colas: Yeah. And in some ways, you know, AI, the whole negative impacts of AI, the whole, like, problem it kind of forms for our whole industry, but with, you know, LLMs being so costly to train, so, you know, energy intensive to train, in some ways it also helps people understand better the implications. Yeah, exactly. And just build up awareness. So I think what you're referring to at PyCon US is, the work on Chris Adams: Yes. Yeah. Thank you. Thibaud Colas: Yeah, machine learning in Python, quantifying the energy impact of that and LLMs specifically. And, yeah, people like him, you know, he's involved with Google Summer of Code for Django, so definitely in the position. Yeah. Yeah. And, I think it's just a, it's a matter of, for us as open source people, of nurturing, you know, those areas of expertise. Making sure we have those people having the conversations and, yeah, also sharing them in a wider sphere of the industry at large. Chris Adams: And I suppose, I mean, one of the other things is that pretty much all the people you've mentioned who are going through the Summer of Code stuff, these are people who are in one of the regions where you're seeing, like, 50 degrees Celsius heat waves and stuff like that. There's kind of like a moral weight that comes from someone talking about, they say, "Hey, I'm experiencing this stuff and I'm in an area which is very much exposed to this," in a way that if you are in some way, you are somewhat insulated to, from a lot of these problems, it doesn't, it might not carry the same weight actually. Wow. Thank you.
I hadn't realized that. Thibaud Colas: I really liked this parallel that one of my colleagues at Torchbox put together about our work in accessibility and the war in Ukraine, talking about other big topics, where, you know, practically speaking, there is a war, it's horrendous, people are getting maimed and they don't necessarily have the same life after. And if you invest in accessibility, it means being better able to support people who come through the conflict with major harm and, yeah, I think it's quite important for us in open source, you know, when I, when Lauri talks about high impact contributions, to hark back to those values you might have about helping others and realize the connection, even though, you know, there are quite a few layers between me, a human contributor, and a Wagtail website, we can have that impact. Chris Adams: Well, I guess that's the whole point of the web, right? The web is, "this is for everyone," like the London Olympics, Tim, I mean, Tim Berners-Lee, his like massive thing. "This is for everyone" being a kind of, okay, we're getting a bit teary and a bit, getting a bit carried away ourselves and we're running short on time, so I should probably kind of wrap this up before we go too far down that rabbit hole. Thank you so much for coming on for this. As we wrap up, are there any projects or things that you would like to draw people's attention to that you found particularly exciting of late before we wrap up? Thibaud Colas: Definitely. I'd recommend people check out this Grid-aware Websites work that the Green Web Foundation puts together. Chris Adams: I did not tell him to say that. Okay. Yeah. Thank you. Thibaud Colas: He did not. But you know, it is actually really impactful in my mind to put together multiple CMS partners through this project and, on a personal basis, this type of project, I was really skeptical of the benefits at the beginning, and it's really interesting to get your thought process starting on, yeah, like tangible ways to move the needle on new sites, but also existing ones. So specifically the work we're doing for this project, Google Summer of Code. I think we'll have results to show for it in about a month's time and hopefully it's reusable for other people. Chris Adams: Yeah, there's actually, okay, now that you mentioned this, I've just gotta touch on it. There is actually a Grid-aware SDK, which is currently out there, and you can think of grid aware as being, kind of, quite comparable to carbon aware, basically, but with a few extra different nuances. The thing I should probably share is that this is actually work that has had a degree of funding from SIDN, which is a Dutch foundation that has been trying to figure out what to do in, like, greening the internet. So there are pockets of interest if you know who to speak to. And hopefully we should see more of these things kind of bearing fruit over the coming months. Alright. I don't wanna spend too much time talking about that, because we're coming up to time. Thibaud, thank you so much for giving us your attention and time and sharing your learnings, both in the world of Django, Python and in Wagtail. If people are curious about what you're up to, where should people look? Thibaud Colas: I'd, simply enough, love for them to join yet another thing you didn't ask me to mention, which is the Climateaction.tech Slack. This is my favorite place to, you know, have this tight-knit community of tech people working on this stuff. And just DM me there.
And I'll be very happy to answer any questions about any of this or just get you started with your own projects. For me, specifically, otherwise in the Wagtail world, the Wagtail Newsletter is a good place to have this work come to you on a weekly basis. And, yeah, just LinkedIn otherwise. Chris Adams: Brilliant. Thank you so much for this. I hope you have a lovely day and yeah. Hopefully we'll cross paths again soon. All right. Take care of yourself. Thibaud Colas: Pleasure, Chris. Chris Adams: Cheers. Okay, bye. Hey everyone, thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners. To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation in any browser. Thanks again, and see you in the next episode. Hosted on Acast. See acast.com/privacy for more information.
Cloud Native Attitude
Environment Variables host Anne Currie welcomes Jamie Dobson, co-founder of Container Solutions and author of the upcoming book Visionaries, Rebels and Machines. Together, they explore the history and future of cloud computing through the lens of sustainability, efficiency, and resilience. Drawing on insights from their past work, including The Cloud Native Attitude and Building Green Software, they discuss how cloud-native principles can support the transition to renewable energy, the potential and pitfalls of AI, and why behavioral change, regulation, and smart incentives are key to a greener digital future.Learn more about our people:Anne Currie: LinkedIn | WebsiteJamie Dobson: LinkedIn | WebsiteFind out more about the GSF:The Green Software Foundation Website Sign up to the Green Software Foundation NewsletterResources:The Cloud Native Attitude: Amazon.co.uk | Anne Currie, Jamie Dobson [01:21]Building Green Software: O'Reilly | Anne Currie, Sarah Hsu, Sara Bergman [01:38]Visionaries, Rebels and Machines: Amazon.com | Jamie Dobson [03:28]Jevons paradox - Wikipedia [11:41] If you enjoyed this episode then please either:Follow, rate, and review on Apple PodcastsFollow and rate on SpotifyWatch our videos on The Green Software Foundation YouTube Channel!Connect with us on Twitter, Github and LinkedIn!TRANSCRIPT BELOW:Jamie Dobson: We've loaded up all these data centers, we're increasing data sets, but ultimately no matter how much compute and data you throw at an artificial neural network, I think it would never fully replace what a human does. Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software. I'm your host, Chris Adams. Anne Currie: Hello and welcome to Environment Variables Podcast, where we give you the latest news and updates from the world of sustainable software development. So this week I am your guest host Anne Currie. And you don't have the dulcet tones of Chris Adams, you're left with me this week. So we're gonna do something a little bit different this week. I have got an old friend and colleague and co-author, Jamie Dobson, in to talk about it. So Jamie is the co-founder and CEO of a company called Container Solutions. And he's the author of the soon to be released book, Visionaries, Rebels and Machines, which I've read, and that's what we'll be talking a lot about. And he's also the, one of my co-authors of a book I wrote nearly 10 years ago called The Cloud Native Attitude, which is about the principles of moving into the cloud. And there's an awful lot in there about sustainability with that, there's a lot we need to talk about around that. And it was actually for me, the precursor to the book that I wrote which came out with O'Reilly last year, with co-authors Sarah Hsu and Sara Bergman, Building Green Software, which as I always say every week, everybody listening to this podcast should read because you'll find it very interesting and it couldn't be more germane.
So today we're gonna talk about those three books, really, and the thematic links between them all, which are really about resource efficiency, building at scale without it costing a ridiculous amount of money or using a ridiculous amount of resources.And also resilience, which is something we're gonna really have to focus on when it comes to running on renewable power. So, let me let Jamie introduce himself and maybe tell us a little bit about his new book, Visionaries, Rebels, I can never remember whether it's Rebels, visionaries and Machines.It's Visionaries, Rebels and Machines. Go for it, Jamie.Jamie Dobson: Visionaries, Rebels and Machines. That's correct. Hello Anne. Thanks for having me on the podcast. And hello to all your listeners. who tune in every week? Yeah. So my name is Jamie. I am indeed the co-founder of a company called Container Solutions. But it's no longer, I'm no longer, I should say, the chief exec,'cause I handed that role over about a year ago, which is probably why, or, you know, it explains why I could find the time to finish writing this damn book. So Container Solutions is a company that specializes in cloud transformation, helping customers, you know, get off whatever infrastructure they're running on now and get onto, you know, efficient cloud infrastructure.And if we do that right, then it's kind of green and sustainable infrastructure, but it's hard to get right, which I'm sure we're gonna discuss today. Anne Currie: Indeed. Yes. Yes. So, so you've got a book that's about to come out, which I have read, but it's not yet available in, the, in the stores, but it will be available on, in all good book bookstores, Visionaries, Rebels and Machines. And I, the reason why I asked you to come on is because I think there are a lot of ideas in there that would, that we need to be talking about and thinking about.So, so tell us a little bit about Visionaries, Rebels and Machines, and then I'll tell you why I think it's interesting.Jamie Dobson: Absolutely. Yeah. So, so Visionaries, Rebels and Machines, we have to start at a point in time. And that point in time is about four or five years ago. And I was asked the question, "what's the cloud?" It was, the person asking me, it was a junior colleague, new to Container Solutions. And, you know, I started to answer, or at least I opened my mouth,and of course I can answer that question, but I can't answer it necessarily succinctly. So I was asked the question, I think probably around about June, so maybe about five years ago today actually. And over the summer period I was thinking, "God, how do you answer that question? What is the cloud?" And so I started to creep backwards in time.Well, the cloud is, you know, there's a bunch of computers in a warehouse somewhere. But what's a computer? And then once I asked that question. Well, computers are things made up of transistor. Well, what's a transistor? And what I came to the conclusion over the summer, was the following:The cloud can only really be understood in its own historical context. And so interestingly, once we got to the point of, you know, answering the question, what is the cloud? The arrow was already flying. You know, there was a, an arrow was shot round about the late Victorian time at Thomas Edison's Menlo Park facility in New Jersey, and that arrow flew all the way through the last century through the web, through cloud computing, and it continues to fly with the rise of artificial intelligence. 
And so the last part of the book is, okay, now we know what the cloud is and what it does, where might it take us next in regards to artificial neural networks and all of that stuff? So that was the book. The Visionaries and the Rebels are the people who built teams, built teams that were innovative. All of them had psychological safety even though the, that concept wasn't known at the time. And so, these historical figures are not just ancient history, like not just Thomas Edison, but also the Jeff Bezos's of the world, the Reed Hastings's, and the modern figures of cloud computing. The visionaries and the rebels can teach the rest of us what to do with our machines, including how to make 'em sustainable. Anne Currie: And that is the interesting thing there. So I enjoyed the book. It's, it is quite, it is a readable romp. And I very much connect with your, with your initial motivation of trying to explain something that sounds simple, but actually you realize, oh gosh, I'm gonna have to write an entire book to even get my own head around this rather than, you know, 'cause that was true for, well, when we wrote, it's actually, the Cloud Native Attitude, which was the book that we wrote together, started off 10 years ago, was pretty much for the same, it was kicked off in the same way. We were, we were saying, well, what is cloud native? What, what are people doing it for, and why are they doing it this way? And quite often, and Building Green Software, the O'Reilly book, which is really germane to this, to this podcast, was again, the same thing. It's what is, what does the future look like for systems to be sustainable? How do we align, and make, what is the future gonna look like? And, where, and that's always seated in the past. What has been successful? How did we get here? Jamie Dobson: Absolutely. So you can't move into the future unless you understand your past. And I think the similarities between the Cloud Native Attitude and Visionaries and Rebels is the tone. So my book deals with horrible things, child poverty, exploitation of people, and the truth is that a reader will put up with that for maybe one paragraph. So if you want to, if you want to teach computing and how it can enslave the human race or not, or how it can liberate them and touch all of these really difficult themes, you've got to do it in a pretty lighthearted manner. And the reason people are saying, "oh, it's a page turner, it's entertaining, it's a bit of a romp," it's because we focus on the characters and all the things that happen to them. And I think that started with the Cloud Native Attitude because unless you can speak quite lightheartedly, you so quickly get bogged down in concepts that even for people like us who work in computing and are passionate about computing, it's just extremely boring. And there are some fantastic books out there right now about artificial intelligence, but they're so dry that the message fails to land. And I think I was trying to avoid that. Anne Currie: And you know for, 'cause we wrote Cloud Native Attitude together. But it is, if these, books are ideally a form of leadership. When you write a book, you are either, you are kind of saying, look, this is what I want to happen in the future. You're trying to lead people and explain and reason and inspire. But you have to inspire. If it's boring, you're not gonna lead anyone. No one wants to follow you to the boring location, they want to follow you to the exciting location. Jamie Dobson: No. Exactly.
And I think the problem is computer people, most of us have been to university, so we're on the academic path. And what happens is you forget to tell stories. So everything becomes about what the research says, "research indicates." So it's all exposition and no narrative. And the problem that is people switch off very quickly, and the paradox is that you don't make your point because you've bored your reader to death.Anne Currie: Yeah. And this is something that's, that comes up for me over and over again in the green software movement that we quite often, we tell the story of it's being, everything being very sad. And everybody goes, "well, I don't wanna be there in that sad world." And, but it's not a sad story. I mean, it is like climate change is a really sad story.It's terrible. It's something we need to avoid. We're running away from something, but we're also running towards something. Because there's something amazing here, which is renewables are so cheap. If we can build systems that run on solar and wind, and a little bit of storage, but not, but much less storage than we currently expect,then we have a world in which there's really loads more power. We can do so much more than we do now, and it's just a matter of choosing what we do with it. It is a, we are not just running away from something. We're running towards something, which is amazing. And, so yeah, we tried to keep that tone.And Building Green Software is designed to be funny. You are. It's the only O'Reilly book. One of, one of my reviewers says it's the only O'Reilly book where you actually get, you laugh out loud whilst reading it. You could read it on the beach.Jamie Dobson: This is exactly why we created a conference at Container Solutions called WTF. What The F is Cloud Native? And it's basically because if you cannot entertain, you'll never get your message across. I've got a question for you, Anne, this wonderful future that we're heading towards, I see it as well. But in the research for visionaries and rebels, there was a big chapter I had on Henry Ford, and in the end it didn't, quite make it into the book, but basically, once Edison had created electricity, then all of a sudden you had elevators for the first time. So the New York landscape did not become a thing till we had electricity because there was a limit on how big the buildings could be. And that exact moment Henry Ford came in with the motorcar, and he was so successful in getting it off the production line cheaply, the beautiful boulevards of New York, of American cities, New York, St Louis, and places like that ended because basically people said, "well, we don't need to be in the city.We can drive to the suburbs." And a lot of historians were saying if Henry Ford had just gone a bit slower, we would've adapted to the motor car quicker and therefore the cities of today would look very different. And one of my concerns with green software is,the speed of which we're moving with data centers and AI is so quick.I wonder if we're having another motor car moment. 
the future's within grasp, but if we go too quick, might we screw it up on the way?Anne Currie: So I think what you are circling around here is the idea of, it is something that comes up quite often, which is Jevons Paradox, which is the idea that, as you get better at using something, you use more of it, it becomes cheaper, because actually because there's untapped demand.So where there's, where people are going, "gee, you know, I really want to live in a high rise city because then naturally everybody can live together and it will be vastly better for us and we'll prefer it. And therefore we take more elevators and we go up because we've got elevators."And people really want cars. I mean, it's one of the things, I don't drive. but everybody loves to drive. There's no point in, tying green with like nobody driving because they love to drive. And there was untapped demand for it, and therefore it was met. And remember at the time there was really, but back then we didn't consider there to be any problem with using more petrol. We didn't consider there to be any problem with using fossil fuels. And everybody went, "yeah, hooray! Let's use more and more of it."But it did massively improve our quality of life. So I think all green messages we have to say, well, we want the improvement in quality of life, but we also want a planet and we have to optimize both of those in parallel.We can't say that you're trading off. And this, I know that people have a tendency to look down on efficiency improvements, but efficiency improvements are what has driven humanity up until now. And efficiency improvements are so much more powerful than we think. We just don't understand how much more efficient things can get.Jamie Dobson: Yeah. Anne Currie: And therefore we go, oh, well, you know, we, if people have 10 times as many cars or whatever, probably not 10 times as many. Well, compared to back to Henry Ford's days, we've got a lot more cars. We've got a lot more mobility. There is a almost seemingly limitless, demand for cars. But there are plenty of other areas of life where efficiency has outstripped the demand.So in terms of electricity use, household electricity use in the west in the past 20, 30 years, household electricity use, despite the fact that everybody has automated their houses we've got, everybody's got washing machines and dishwashers and tumble dryers and TVs, and electricity use has still gone down.And the reason why it's gone down is because all of those devices appeared, but then became more and more efficient. And efficiency improvements really are extraordinarily powerful. Much more than people realize. And if we force people to put the work in, and it's not free, it requires an enormous amount of work, but if people are motivated and incentivized to make those efficiency improvements, we can do an awful lot.We can get.Jamie Dobson: My suspicion is the world will change. So not many people realize that the car was actually very good for the environment. All around London, my children ask me, what's that thing outside the house?" It's a scraper for your feet, for your boots. And that's because all the streets of London were caked two inch shit deep of horse manure.And at the end of every single street, the way it was piled high. So the public health issues with horses was an absolute nightmare. Not to mention the fact that people used to get kicked in the head or pulled into ditches. Fatalities from horses was, you know, a weekly account in New York City. But so it changed. 
So once we got the electricity, we got the lifts, the horses went away. My suspicion is right now we cannot run a sustainable culture or city without radically changing things. So, for example, did you ever stop to wonder why is your power pack warm? You know, when you charge your phone or your laptop, why does it get warm? Do you know what the answer to that question is? Anne Currie: No, I don't actually. That's a very good question. Jamie Dobson: There you go. So who won? Who won the battle? Tesla or Edison. So. Anne Currie: Tesla. Jamie Dobson: Tesla did win. So it's basically AC versus DC. What's the best system to have? Well, DC, direct current, kills you. If you touch it by accident and the voltage is right, you die. But what you feel on the back of your charger is heat, which is a side effect of converting AC back to DC, because computer devices don't work on AC, because it, the current has to go round and round, like water in a fountain, because that's the only way transistorized things work. So now people are saying, well, actually, arguably we should have a DC grid because globally we are wasting so much electricity because of this excess heat that is produced when we go from AC back to DC. So, and I get the feeling, and do you remember when we were kids, if you put your washing on at three in the morning, you got cheaper electricity. I cannot help but think it's not just about renewable energy, but it's also the way we consume energy to make that more effective. Anne Currie: Yeah. Jamie Dobson: And I think if that doesn't change, I basically think, when Edison arrived, society as we knew it absolutely changed. We had no refrigerators and that changed our behaviors. Now, some people would say, well, you became a slave to the machine. I think that's a little bit too far, but we certainly went into some sort of analog digital relationship with the machines we work with, all of which drive efficiencies. I think the next chapter for sustainable energy and computing will be a change in our habits, but I can't, I don't know exactly what they're gonna be. Anne Currie: Oh, that's definitely a thing. It's something I've talked about on the podcast before. It's the mind shift from fossil fuels, which are kind of always on, you know, easy to dispatch, so easy to turn on, easy to turn off, to something, to solar and wind, which is really expensive to store, really cheap if you use it as it is generated. But grids were designed, in many ways this is the same kind of things that you talk about in your book. Grids were originally designed specifically to provide power that was easily dispatchable, you know, that it was fossil fuels. And that means that the whole of the philosophy of the grid is about something called supply side response. And that is all that is basically saying, "do you know, users, you don't need to worry." Flick of a switch, the electricity will always be there and it's the responsibility of the dev, of the providers of the electricity, of the grids to make sure that the electricity is always there to meet your demand. You never have to think about it. But for renewables it's generally agreed that what we're gonna have to do is move to something called demand side response, where users are incentivized to change their use to match when the sun is shining and the wind is blowing. As you say, when we were kids in the UK, we used to have something called Economy 7. You had seven hours a day, which was usually at night.
where, because it was all, because back in then, I'm guessing, 'cause it was a coal fired power station. Coal fired power stations were not so easy to turn off and on again, which gas is. So we don't have it anymore. And it's, and, but in those days you say the coal fired power station was running during the night and nobody was using the power.So we wanted to actually get people to try and use the power during the night. And we used effectively what are now called time of use tariffs to incentivize people to use spare power, which was during the night in the UK. Jamie Dobson: It sounds like a huge dislocation to life, but when I first came to London, the London Mayor or the authorities made an announcement that when something like this, "oh, air pollution's really bad today. Don't go out running, close your windows. Old people don't go out, don't do any exercise."And I remember thinking "this can't be real. Is this some sort of prank?" But this is a thing in London. And I remember thinking, but at no point would the Mayor of London say, "okay, the air pollution's bad. You're not allowed to drive your car today," right? And it showed where the priorities lie. But it wasn't that difficult.So everybody just shrugs their shoulders and says, "oh, well, okay, I just won't do any out outdoor activities today." So I think that demand side response is possible. I do wonder what happens though if, let's say, obviously the sun's shining, so that's the time you should run your data centers. What happens when the sun's not shining?Are the cloud providers gonna be happy to have an asset sat there doing nothing when it's dark, for example, or when the wind's not blowing?Anne Currie: Well, it's interesting. I think it depends how much, if, it's all about what is the level of difference in electricity cost between the time when the sun is shining or the wind's blowing, and the time when it isn't. I'm massively impressed by work that India is doing at the moment on this, on time of use tariffs because they have tons of, or and they know what they're looking forward, they already know they're one of the fastest growing. So India is one of the fastest growing countries in the world for rolling out solar power. Unsurprisingly, 'cause it's pretty sunny in India. So they're looking forward and they're thinking, well, hang on a minute.You know, we are gonna have this amazing amount of solar power in the future, but we are going to have to change people's behaviors to make sure that they run on it, not the other thing. So, the way they're doing that is that the strategy that they're adopting for incentivizing people to change their behavior.And as you say, actually people will change behavior. They just need a little bit of a push and some incentives and they will change their behavior. The incentive they're using is time of use tariffs. And India is pushing out all of the province, the states in India to introduce time of use tariffs which reflect the actual cost of electricity and push people towards times of the day when they're, when they'll be. And it's, it is a gradual process, but you can see that it will roll on and on and they're, looking at a tenfold difference that what they're saying is. That the difference should be tenfold between when your electricity is generated from the sun and when it isn't.And a tenfold difference in price does justify a lot of behavioral change. You might as say, you might not want to turn off your, your data center during the night. 
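The demand side response Anne describes maps quite directly onto software scheduling. Below is a minimal, illustrative Python sketch, using made-up hourly tariff figures rather than any real market data, of a flexible batch job picking the cheapest window in a day.

```python
# Illustrative only: pick the cheapest contiguous window from a made-up
# time-of-use tariff, the kind of decision a flexible batch job could make.

hourly_price_pence_per_kwh = [
    28, 26, 8, 7, 7, 9, 24, 30, 32, 31, 29, 27,
    14, 12, 11, 13, 25, 33, 35, 34, 30, 28, 27, 26,
]  # assumed 24-hour tariff, cheap overnight and around midday solar

def cheapest_window(prices, hours_needed):
    """Return (start_hour, avg_price) of the cheapest run of `hours_needed` hours."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(prices) - hours_needed + 1):
        avg = sum(prices[start:start + hours_needed]) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

start, avg = cheapest_window(hourly_price_pence_per_kwh, hours_needed=3)
print(f"Run the 3-hour job at {start:02d}:00 (avg {avg:.1f} p/kWh)")
```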
But some people will go, well, hang on a minute. If it's literally, because for most data centers, the main cost is electricity. If there's a tenfold difference in electricity cost between, the day and the night, then they'll start to adapt and start to do less and start to turn things down.Necessity is the mother of invention. If you don't, if you give a flat tariffs to everybody, they're not gonna make any changes. But if you start to actually incentivize response, demand side response, it will happen.Jamie Dobson: Then of course then that comes back to regulation, doesn't it? Because I think of the things that Edison, well actually it was his colleague, Samuel Insull, realized if you're gonna, it makes no sense to run the grid unless it's some sort of public utility or a natural monopoly. And you can only really fairly run a natural monopoly if the price is a negotiated and set in public and all the industries regulated. So do you think the, that these tariffs, the time of use tariffs, will become part of the regulatory framework of the governments.Anne Currie: Oh, yeah, I mean, it already is. I was saying it's, in India. It's a regulatory thing. It is part of the industrial strategy of India. There are.Jamie Dobson: Then indirectly, then indirectly the cloud providers will be regulated because they'll be regulated through the supply of electricity.Anne Currie: Indeed. Yeah. I mean it's interesting. there's a battle in, so some European countries, it's happening at the moment. I think, Spain already has time of use tariffs. There are other countries that have time of use tariffs and it changes behavior. And in the UK there is a battle at the moment, over, between suppliers about where the time of use tariffs are introduced.So that battle is kind of being spearheaded by the CEO of in the UK it's Octopus Energy. Greg Jackson isn't it, I think is really saying, "look, this is what we need to do." Because, I mean, in the UK it is ridiculous that the government really doesn't want, they fear that everybody will be panicked and not be able to handle a time of use tariff.And, but even though we used to have them not very long ago.Jamie Dobson: It's ridiculous. People always panic about the public sentiment, but you just need to look at COVID, how flexible people can be when they understand the need for it. That's number one. And number two, when I was a kid, and that's only 40 years ago, we used to tend to lights off 'cause it was too expensive.So we did have different behavior in the evening when we needed more electricity than in the daytime when we didn't. It's not that difficult to imagine. You know what? Do you know? What made me laugh is the average serving of meat, I think in the 1970s was 200 grams. And if you look at 200 grams, it's actually quite tiny.It sits on your plate like a little slither of lamb. I was like, "oh my God, that's not enough. That's not enough food." But then you realize that is what we all used to eat, only 30 or 40 years ago. And so we've slowly been sort of, you know, everything's been supersized, including what we expect from the electricity companies. I think a gradual shifting back, you would, you'd barely notice it. And that's exactly how the government took salt out of our diet. That just slowly regulated how much salt could be in processed food until it had gone all together.Anne Currie: Yeah, but I think you have to be careful about how you pitch this. 
Well, I think one of the issues with green is that it's pitched as, it's a reduction in meat and it's a reduction in, there's a reduction in that. I don't think it only has to be a sad story. It has to be a good story.Something we're, a hill that we're, that we want to take because it's worth taking, not just something that we're, we are running away from that. I like the time of use tariff approach in India because it's saying, if you do this, you'll get electricity, which is a 10th the price, you know that it is something, it's a win.It's not just like run away from the bad thing. It is run towards the good thing. And it with a minor, and you're not saying, "change your behavior because we're ordering you to do it" or because we're going to make electricity much more expensive. Although inevitably, electricity, fossil fuel, electricity will become more expensive because it is naturally more expensive these days.Renewables have become so cheap. Jamie Dobson: Could cloud computing become a forcing function for cheaper electricity? Because the cloud providers need so much electricity, could this possibly accelerate the sort of the raise to green energy? Anne Currie: Well, it definitely can, and it has done in the past. I mean, it in, the early days, the well, so until maybe five years ago or so, the biggest non-governmental purchaser of renewable power in the world was Google. And they were buying renewable power, they were buying and, bankrolling renewable power for their data centers.And they, so they're not the biggest, non-governmental purchaser of renewables anymore because it is now amazon to power their data centers because they got a long way behind and we all made a giant fuss about it and said, well, why aren't your data centers green? And so they put a whole load of money into renewables.A lot of the reason why there's enormous amount of renewables these days and enormous amount of investment has gone into it, was because of the cloud vendors. Now, that is not because the cloud vendors are all secretly social justice warriors. I mean, they did it for their own benefit. But they did do it.Jamie Dobson: That's another pattern that reoccurs is so, at the turn of the last century, so many entrepreneurs were sat on so much money that class unrest was really bubbling. So all of a sudden you got the subway in New York, subway in Paris, the municipal control of transportation, all kinds of stuff.And then you're left thinking, "oh, was, were, they all do-gooders? Was that the reason they did that?"Some of them may have been, but mainly they were trying to avoid class unrest. And so it's interesting that these, a good outcome can come on the back of self-interest, that is true, isn't it?Anne Currie: Yeah, it is true. I it, and it's very hard to know what the unintended consequences, positive and negative of, all behaviors are. So, a lot of investment in early stuff becomes wasted later. So, you, like, you mentioned, subways, railways in the UK and worldwide.Lots of early investment in railways resulted in loads of over provisioning of railways. And then as things got a bit more efficient and everybody goes, well actually you only need one train to go between London and Edinburgh and not 16 different trains on different lines. 
You get some kind of consolidation down and improvements in efficiency and that's how actually things become cost effective, because actually overprovisioning is very cost ineffective. Jamie Dobson: Well, that's true, but that is a very cheeky way to transfer money from rich people to poor people, because obviously what happened is, rich people invested in the railways, railways were over provisioned, those people never got a return. The rest of us were left with cheap railway infrastructure. Exactly the same happened with the internet. Everyone's like, right, we gotta wrap the world up in optic fibers. Private companies came in, private investors came in, paid for all of that. Then we had way too many optic fiber cables, and now we've all got practically free internet access. So that occasionally it, it goes either way. Anne Currie: Yeah, and I have to say, I, and I see the same thing with AI. So AI is interesting 'cause on the one hand I rail against how, and AI is unbelievably inefficient at the moment, that there's an awful lot of talk about, oh, we'll have to build nuclear because we need it for AI and all that kind of stuff, and we'll build all the nuclear and we'll build all the, you know, and hopefully, well, we need to try and steer people towards doing it with nuclear and doing it with solar and wind rather than, rather than fossil fuels. But at the end, it's going to be, there's so much wasted inefficient code in AI. AI is going to need a fraction of the power that we eventually build, we initially build to power the AI. I mean, because at the moment I'm talking to people who are doing measurements and differences between different AI models that do, you know, an equivalent amount of stuff. The ones that are optimized, 10,000 times more efficient, 600,000 times more efficient. I've even heard a million times more efficient. There's so much waste in AI at the moment. Jamie Dobson: Absolutely, and I think people don't, are not focused particularly on theoretical breakthroughs. So Geoffrey Hinton came up with the backpropagation of errors in neural networks. I think it was about 1983. That's in the book by the way. And that was a breakthrough. That breakthrough, that theoretical breakthrough's got nothing to do with computing power or anything. It's a theoretical breakthrough. Right now we're desperate for something like that. So we've loaded up all these data centers, we're increasing data sets, but ultimately no matter how much compute and data you throw at an artificial neural network, I think it would never fully replace what a human does. So I think it's nice to know that as we lay, you know, we lay down this computing infrastructure and fingers crossed all of it's powered by, you know, renewable energy, in the background, researchers will be chipping away at the next theoretical breakthrough. And I think they have to come with artificial intelligence because I think there will be limits to what you can do with generative AI. And I think we're probably reaching those limits right now. Anne Currie: Well, improving AI efficiency does not require massive theoretical breakthroughs. It just, it can be done using the same techniques that we've used for 30 years to improve the efficiency of software. It is just software.
I mean, if you look at DeepSeek, for example, DeepSeek did, have done, I think, so DeepSeek had to make their AI more efficient because the Biden administration said they can't have the fancy chips. So they just went, "oh, we can't have the fancy chips, so we're just gonna make some software changes." And they did it like that, effectively. They're a tiny company and they increased the efficiency tenfold pretty much instantly. And they used three different methods, one of which is probably Max House, and that was probably most of the 10x. The others, there's still so much room for additional efficiency improvement with them. They did, they got rid of over provisioning. They moved from 32 bits of precision to eight bit precision 'cause they didn't need the 32 bit. That was a classic case of over provisioning. So they've removed the over provisioning, and that's been known about for years. That's not new. AI engineers have known that 32 bit is over-egging it, and that they could run on 8 bit, for years. So they didn't do anything new. They didn't have to do any new research. All they had to do was implement something that has, that was well known, but people just couldn't be assed doing. Jamie Dobson: Yeah, all of this noise will soon die down and people behind the scenes, away from the attention-grabbing headlines, will continue to crack on with these things. And so my prediction is that everything's going to, everyone's gonna be pissed off in the next six to 12 months. "AI failed to deliver," but in the background, use cases will get pieced together. People will find these optimizations, they'll make it cheaper. And I do reckon, ultimately, generative AI will sink into the background just in the same way that nobody really talks about the internet, right? It's the web or it's mobile phone applications that do something sat on top of the computer network infrastructure. I think that's probably what's gonna happen. Anne Currie: I suspect that generative AI is not going to entirely disappear just because, so I used to work, many years ago, I worked in the, in the fashion industry. I was, I worked for a company that was one of the first pure play internet e-commerce companies. And because it was fashion, we used a lot of photography. An awful lot of photography, and a lot of it, we had a whole team of editors. So, you know, I can see companies that work with photography, they have, a surprisingly large number of people in the world edit photographs. And so you know that there's a huge demand for making that easier. The downside is that you then, even now, all photo, all photographs that you see online represent people who do not exist. You know, they, it is like all models you see, it's probably not, that model kind of is kind of based on a person, but. Jamie Dobson: Lots of people, isn't it? So I think that generative AI stuff will remain, but I think it will become specific. So for example, I saw yesterday that the government are piecing together a number of different tools, let's call that the substrate, but on top of that, it's to give civil servants a conversational interface about what our policies were, can you summarize this for me, can you suggest a new policy, which is dangerous because anything, any decision based on past data, it's a reflection of what was and not necessarily a vision of what could be. So I think that's probably what's gonna happen, but I could be wrong, because the truth is none of us actually know. It's all speculation at this point.
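The 32-bit to 8-bit point Anne makes above is, in generic terms, weight quantization. Here is a minimal, illustrative PyTorch sketch of dynamic int8 quantization on a toy model; the layer sizes are arbitrary, and this shows the general technique rather than DeepSeek's actual changes.

```python
# Illustrative only: dynamic int8 quantization of a toy model with PyTorch.
# This shows the generic 32-bit -> 8-bit idea, not DeepSeek's actual methods.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10))

# Replace the Linear layers with versions that store weights as 8-bit integers.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def weight_megabytes(m: nn.Module) -> float:
    return sum(p.numel() * p.element_size() for p in m.parameters()) / 1e6

print(f"float32 weights: {weight_megabytes(model):.1f} MB")
# The quantized Linear layers keep their weights in packed int8 buffers,
# roughly a quarter of the float32 size for the same layer shapes.
x = torch.randn(1, 1024)
print(quantized(x).shape)  # the quantized model still runs the same forward pass
```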
Anne Currie: Yeah, so, so before we, well actually we've still got a bit of time, but before we go, I want to focus a little bit on what I see are the themes that run through the creation of the internet and the creation of modern technology in Visionaries, Rebels and Machines, and the Cloud Native Attitude, and Building Green Software.And I think a lot of the themes there are, trying to de deliver your results, the thing that you want, the thing that's gonna improve your life, or the thing that people think is gonna improve their life on fewer resources with fewer resources, because that's the only way it scales. The cloud was all essentially all about how do we deliver our Google's, I mean, it was the cloud was, came outta Google. And it came outta Google, which was the first hyperscaler, and Google was saying, well, actually we really need to deliver our services at incredible scale, but we can't spend the, you know, there's a limit to how much money we can spend on doing it.So we have to do it using operational efficiency and code efficiency so that we deliver this service on fewer resources and also resilience, you know, because things fail at scale and therefore we need to be resilient to failure. But that efficiency and that ability to respond to changing circumstances is exactly what we need to handle the energy transition.Jamie Dobson: So I think the common theme that goes all the way from Thomas Edison to the teams building systems using AI now is that technologies change, but human nature doesn't. So, so the way those teams were managed has been absolutely consistent. I think one of the great contributions of Visionaries and Rebels is to show to people, you don't need to change the way you manage your techies because actually these, this is of success stories that lasted 150 years. Second theme is that once the foundations are laid, it's not the creators of a technology that dictate its destiny, but the users. So once we had a grid, boof, people started inventing applications. Exactly the same once the internet was there, people started inventing web applications. And once the cloud was there, we had Netflix, and then we had Starling Bank and all the things built on top of the substrate. So I think for sure what's gonna come next for sustainable computing will not necessarily be dictated by those building cloud infrastructure. The teams out there, the safe teams, the innovative teams taking risks. I think they will find the use cases. They will dictate what happens next.Anne Currie: Well, so that's interesting 'cause that actually instantly reminds me of the approach that India, which we already talked about, that India are taking where you say, well look, we'll incentivize people to stick a whole load of renewable power into the local grid, into the grid. We've got the grid.The grid just distributes the power and we introduce those incentivizing time of use tariffs, and we say, look, you know, there's really cheap energy at these times. Fill your boots. You decide what you're gonna do with it. And then just leaving the users of the grid, the users of those time of use tariffs to work out what's gonna happen.Jamie Dobson: And I think people will look to India. I think everybody looks at other countries that are doing these experiments. So if it works out in India, then of course you could imagine that other countries might say, "oh, well, that's actually worked out over there. We can copy that as well." 
But ultimately they're building on existing infrastructure.You know, they say, well, this is what we've got, what, you know, how can we, what does that interface between our users look like? And by making a change there, they will change user behavior somewhere else. Anne Currie: Yeah.Jamie Dobson: It's hard to predict, though. It's hard to predict.Anne Currie: it is hard to predict it. it's kind of, it's an interesting, something that comes up in grid discussions about this, quite often, is this whole kind of idea that, in some ways countries that are less developed than America and the UK are in a much better position for the energy transition because governments can go, we'll have time of use tariffs in every day.We'll, it's not that far. For, you know, the people quite used to microgrids, they're quite used to things being fluctuating. They're not, they haven't got used to everything being available at the flick of a switch and a hundred percent reliable. Reliability, to a certain extent, breeds fragility. It breeds people who've forgotten how to handle change.Jamie Dobson: Yeah. So of course there are places in the world that have got cell phone infrastructure, but they don't have any telecommunications infrastructure, because by the time they came around to installing it, cell phones were a thing, so they just completely skipped. That whole step in technology. We've still got phone boxes in the UK that we, nobody knows what to do with. They're on the street corners, growing moss, and that's a legacy, exactly like what I mentioned earlier, the mud scrapers outside of people's houses. These are a legacy of previous sort of infrastructure. Horses in the case of the scraper and then the telephone boxes in case in the case of cellphones. So I think that's true that india probably has got places that are either off grid or nowhere near as reliable as what we have, for example, in the UK. So then it makes sense that the government can be more experimental because the people are not gonna lose anything. There's nothing to lose.There's only gains. Anne Currie: Indeed. Yes. And in fact, actually, I mean, it is interesting that time of use tariffs being introduced in the UK is now controversial because we have become strategic snowflakes. We can't. We can't, they fear that we can't change, although I think they're wrong. And in fact, time of use tariffs were totally fine 30 years ago.And nobody died as a result of economy seven heating.Jamie Dobson: There's an absolute relationship between the reliability of a system and how spoiled its users has become. So if you, when I first went to the Netherlands, the train would be two minutes late and people would literally slam their feet on the ground in anger, right? And swear in Dutch about the state of the NS. Coming from the UK it's like, "well, whatever."Now, exactly the same happened when, when the video store came along. Most people were used to consuming media as and when, you know, they chose to. But with the video shop, they only had limited editions of new releases. The frustration that created in users of video stores is exactly what led to Netflix's creation. So the more reliable something is, the more complacent, and the higher the expectations its users have of the system. But I think COVID taught the UK government that we could be way more flexible than they fear we are. Anne Currie: Yeah. I agree. And, except actually I don't think they learned that lesson because they immediately forgot it again. 
Jamie Dobson: Apparently there's loads of lessons they didn't learn. 'Cause apparently we're less ready for a pandemic now than we were before COVID.Anne Currie: Yeah. It is a, it is amazing how many lessons we didn't learn that, but, I think that takes us through a final thing that we should discuss, which I think comes out of what you've just said there about resilience, which is some, and it's something that is a modern thing that we talk about a little bit in the book, in all those three books, which is Chaos Engineering, which is the modern approach to resilience, which is that you get more, ironically, you get more resilient systems by building them on top of systems that you don't expect to be a hundred percent resilient.The expectations of, of a hundred percent availability, supply side response builds, in the end, more fragile systems. Jamie Dobson: The fragility has to go somewhere. So the more resilient the system is, the more fragile the users are. And then the converse of that is true. The more a system fails, the more flexible its users become, and the more workarounds they have because they're not sure if it's gonna be ready. I do know one of the key lessons I took while whilst putting Visionaries and Rebels together could be distilled into one sentence. A system that doesn't fail during its development will fail catastrophically in production. And so what you're left with is electricity grid, the internet, the cloud computing, they're so amazingly, you know, resilient and reliable, they are literally are literally always there. You start to take, you do start to take them for granted. but the paradox is that if you want to create resilient systems, you've got to simulate, stimulate failure in order to learn how to deal with failure, therefore avoid it in the future. It's all a little bit circular really.Anne Currie: Yeah. Yeah. So the irony is that exposing end users to the fluctuation in the availability and price of electricity for renewables, it sounds scary, but it will produce, in the end, a more resilient society. A more resilient system on a countrywide scale.Jamie Dobson: And in your opinion, what's the relationship between this, these type of tariffs and demand side behavior and cloud computing? Where is the link there?Anne Currie: Well, I mean, data centers are users of a grid. They are users that, they are prime users of electricity. If we make a tenfold difference, and I don't think it's gonna, it's gonna affect, it is gonna work for anything less than a tenfold difference in price, we will start to see behavioral change.We will start to see data centers go, "do you know, is there a way that we can, we can reduce the number of machines that are running," because at that point the cost will start. So we need to get it to a point where the cost, the different time of use tariff costs make it worthwhile switching to operations to when the sun is shining the winds blowing.But that is what we have to do, because we need the demand side response behavior. We need the change of response from users. So we have to make it worth their while.Jamie Dobson: You're gonna use economic nudges to make data centers consume green energy, right? So that's the energy side of the equation. What do we do about water supply? 
So, I don't know if you realize, but lots of data centers have been refused planning permission because they will drain fresh water from people's houses, and governments, you know, are not ready to sort of take that on the chin. So what are your thoughts on the water issue? Anne Currie: Well, again, that is a known issue. At the moment, if they don't have to do it, they won't do it. Cooling using water is very cheap and easy, and therefore that's what they do. That is the default. But there are alternatives. I mean, if you look at more modern chips, I mean Intel, it's a bit of an old-fashioned chip these days, it's very hot. The Nvidia chips are very hot, but there are chips coming out that are much more efficient, that are much cooler, and that are often designed to be air cooled, not water cooled. So it is not unknown; the technology exists for chips that don't get so hot that they require water cooling. The future is chips that can be air cooled. And if they can be air cooled, they're cooled with aircon. And aircon can be fueled by solar power, because obviously, you know, it's when it's hot and sunny that you have the biggest problem with heat; when it's not sunny and warm, it's less of an issue. So the solution here is better and more efficient chips, hardware that can be air cooled. That is the case for most hardware. I think that has to be at least a big part of the solution. Jamie Dobson: Does the future involve huge data centers that fall under government regulation? Because one of the reasons why the electricity grid became a natural monopoly is 'cause it made no sense to put six sets of cables down. There wouldn't have been enough space in the street, and actually the electricity providers couldn't get economies of scale and therefore could not pass on cheap electricity to their users, and therefore electricity would never have become widespread. So is there a similar argument for the cloud providers presently? Anne Currie: I have to say I'm a huge believer that we just do it through pricing, and that we want data centers to be close to the power. So in Scotland, we throw away, we turn off wind, we pay wind farms to turn off. We spend billions and billions of pounds every year paying wind farms to turn off because there is no user for that power within easy reach of that wind farm. And we're only talking about Scotland. We're not talking about Siberia. Jamie Dobson: I think we could build a data center there. Anne Currie: Why don't we build a data center there? Jamie Dobson: They've got plenty of wind and water. Anne Currie: And an extremely well educated workforce. And it's a bit cooler up there as well, so you don't need to do quite so much cooling anyway. But there's no incentive. And while there's no incentive, people won't act. Once there is an incentive, and a really juicy incentive in place, you know, a 10x difference in price, we will see behavioral change. Because people, humans, are very good at changing their behavior, but only if there's a good reason to do so. Jamie Dobson: Yeah, absolutely. Yeah. Anne Currie: And actually that kind of brings us to the end of our hour. I think we've had a really interesting discussion. I hope the listeners, and potentially in the future readers, have enjoyed the discussion.
All the links for everything we talked about, all the books, all the comments, will be in the show notes below, so you can go and have a look. And yeah, actually, you know, you can pre-order Jamie's book, Visionaries, Rebels and Machines, on Amazon or any good bookshop now. You can also buy The Cloud Native Attitude or Building Green Software, which you can also read for free if you have an O'Reilly subscription. And when I get round to it, I'm eventually going to create a commons version of Building Green Software, and kick me, everybody should be kicking me all the time to do that, because it's just a bit of work that I need to do. Anyway, so Jamie, thank you so much for being on the podcast. I've really enjoyed our chat. Is there anything final you wanna say before we disappear off? Jamie Dobson: Nothing final for me. There'll be a book launch party in London at some point. It's available on Kindle, but for now, I'm just happy to get, you know, feedback, and it's been great to talk to you today, Anne, and I really hope your listeners took something away from this. Anne Currie: So I hope people enjoyed the conversation. It was a little bit of an author's book club, so a bit different to normal. But I hope you enjoyed it, and let us know if you want to hear more of this kind of discussion. Thank you very much and, until we meet again, goodbye from me. Chris Adams: Hey everyone, thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners. To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation in any browser. Thanks again, and see you in the next episode. Hosted on Acast. See acast.com/privacy for more information.
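One practical thread from the conversation above is demand-side response for compute: shifting flexible jobs to the hours when time-of-use tariffs (and, typically, grid carbon intensity) are lowest. The sketch below is a minimal illustration of that idea in Python; the tariff table, the price threshold, and the `schedule_flexible_job` helper are all hypothetical, invented for this example rather than taken from the episode or from any real tariff.

```python
from datetime import datetime, timedelta

# Hypothetical time-of-use tariff in currency units per kWh, keyed by hour of day.
# A real scheduler would fetch this from a tariff or grid-carbon-intensity API.
TARIFF = {hour: 0.30 for hour in range(24)}
TARIFF.update({hour: 0.03 for hour in range(10, 16)})  # cheap midday solar window

PRICE_THRESHOLD = 0.05  # only run flexible jobs when power is roughly 10x cheaper


def next_cheap_window(now: datetime) -> datetime:
    """Return the next whole hour at which the tariff drops below the threshold."""
    candidate = now.replace(minute=0, second=0, microsecond=0)
    for _ in range(48):  # look ahead two days at most
        if TARIFF[candidate.hour] <= PRICE_THRESHOLD:
            return candidate
        candidate += timedelta(hours=1)
    return now  # no cheap window found; run immediately rather than never


def schedule_flexible_job(job_name: str, now: datetime | None = None) -> datetime:
    """Decide when a deferrable batch job (training run, backup, re-encode) should start."""
    now = now or datetime.now()
    start = next_cheap_window(now)
    print(f"{job_name}: deferring to {start:%Y-%m-%d %H:%M} "
          f"(tariff {TARIFF[start.hour]:.2f}/kWh)")
    return start


if __name__ == "__main__":
    schedule_flexible_job("nightly-video-transcode")
```

A real scheduler would pull live prices or carbon intensity from an API and respect job deadlines, but the core decision, deferring deferrable work until power is cheap, really is this small.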
How to Explain Green Software to Normal People
Host Chris Adams speaks with James Martin about how to communicate the environmental impact of software to a general audience. Drawing on his background in journalism and sustainability communications, James shares strategies for translating complex digital sustainability issues into accessible narratives, explains why AI's growing resource demands require scrutiny, and highlights France’s leadership in frugal AI policy and standards. From impact calculators to debunking greenwashing, this episode unpacks how informed storytelling can drive responsible tech choices.Learn more about our people:Chris Adams: LinkedIn | GitHub | WebsiteJames Martin: LinkedIn | WebsiteFind out more about the GSF:The Green Software Foundation Website Sign up to the Green Software Foundation NewsletterNews:Environmental Footprint Calculator | Scaleway [14:19]AI on a diet: how to apply frugal AI standards? - Schneider Electric Blog [26:03] Frugal AI Challenge | Hugging Face [33:33]Greening digital companies: Monitoring emissions and climate commitments Resources:Why Cloud Zombies Are Destroying the Planet | Holly Cummins [14:47]European Sustainability Reporting Standards (ESRS) [21:22]EcoLogits [21:54]Empire of AI - Wikipedia [29:49]Hype, Sustainability, and the Price of the Bigger-is-Better Paradigm in AI | Sasha Luccioni et al. [30:38] Sam Altman (@sama) on X [31:58]Référentiel général d'écoconception de services numériques (RGESN) - 2024 [37:06] Frugal AI If you enjoyed this episode then please either:Follow, rate, and review on Apple PodcastsFollow and rate on SpotifyWatch our videos on The Green Software Foundation YouTube Channel!Connect with us on Twitter, Github and LinkedIn!TRANSCRIPT BELOW:James Martin: When I hear the term AI for Good, which we hear a lot of at the moment, I would say that I would challenge that, and I would encourage people to challenge that too, by saying "are you sure this AI is for good? Are you sure this tech is for good? Are you sure that the good that it does far outweighs the potential harm that it has?" Because it's not always the case. A lot of the AI for good examples we see at the moment just can't be backed with scientific data at all. And that comes back to another of my points. If you can't prove that it's for good, then it's not, and it's probably greenwashing. Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software. I'm your host, Chris Adams. Welcome to Environment Variables, where we bring you the latest news and updates from the world of sustainable software development. I'm your host, Chris Adams. Our guest today is James Martin, a content and communications expert who has spent years translating complex tech strategies into compelling narratives that drive change. From leading communications with a special focus on sustainability at Scaleway, to founding BetterTech.blog, James has been at the forefront of making green tech more actionable and accessible. He's spoken at major climate and tech events, most recently ChangeNOW. He's written a comprehensive white paper on green IT, and played a key role in GenAI Impact, a French NGO working to measure the impact of AI. And also he's a Green Software Foundation champion. So, James, thank you so much for joining the podcast.
Really lovely to see you again after we last crossed paths in, I guess, Paris, I think. I've tried to introduce you a little bit, but I figure there's maybe some things you might wanna talk about as well. So, can I give you the floor to just introduce yourself and talk a little bit about yourself? James Martin: Yeah, thanks very much, Chris. First and foremost, I just wanted to say I'm really happy to be on this podcast with you, because this podcast is one of the things that really got me excited, and it started me off on my green IT adventure. So thanks to you and Anne for putting out all these amazing episodes. Basically, I'm speaking today in the name of BetterTech, which is my blog, which I founded in 2018. So I've been a journalist for most of my career. For about 15 years I was writing for a French cultural magazine. I had a page in that every two weeks. And I started off writing, "here's a new iPhone, here's a new console." And after that I got a bit bored of just saying the same thing every time. So I was drawn towards more responsible topics, like how do you reduce your screen time, how do you protect your data? And also, of course, what is the impact of technology on the planet? So that started in that magazine, and then I got so into it, I founded my own blog on the topic. And then that was pretty much when an opportunity came up, in 2020, 2021, to work at Scaleway. I thought that sounds really interesting because that is a European cloud provider, so not American. And also they were already communicating a lot about the sustainable aspect of what they do. So, yeah, I was very happy to join them and lead their communications from 2021 with this huge focus on sustainability. Chris Adams: Ah. James Martin: Yeah, that's basically where it started. At that time, Scaleway had its own data centers, and one of them, called DC5, is one of the most sustainable in Europe because it doesn't have air conditioning, so it uses a lot less energy. That's it. It has adiabatic cooling. So we focused a lot of communication efforts on that. But then after a year or two, Scaleway decided to sell its data centers. I had to look at what are the other ways I could talk about sustainability in the cloud? So from digging around into green IT, especially into some Green Software Foundation resources, I basically understood that it's not just data centers, it's hardware and software. I also had a bit of help from one pivotal meeting, which was meeting Neil Fryer from the Green Software Foundation at a conference. I got him to come and speak at Scaleway to people like me who were sort of concerned about the impact of tech. And then that led to the white paper that you mentioned, which I wrote in 2023, which is basically how engineers can reduce the impact of technology. And then that led to speaking opportunities, and then to realizing that, yeah, I'm not a developer. I'm not an engineer. I may be the first non-developer on this podcast. So I can't build green tech, but I can explain how it works, and I think that's an important thing to be able to do. If we want to convince as many people as possible of how important this is, then it needs to be communicated properly. And, yeah, so that's what I've been doing ever since. Chris Adams: Okay, thanks.
Okay, so I'm, I appreciate that you're coming here as not as a non, as someone who's not like a full-time techie who's like using GitHub on the daily and everything like that, because I think that means you, you get a bit of a chance to like see how normal people see this who aren't conversant in like object storage or block storage or stuff like that.So maybe we can talk a little bit about that then, because when people start to think about, say, the environmental footprint of digital services, right? It's often coming from a very low base. And it's like people might start thinking about like the carbon footprint of their emails, and that's like the thing they should be focusing on first.And like if you do have a bit of domain knowledge, you'll often realize that actually that's probably not where you'd start if you have a kind of more, more developed understanding of the problem. Now you've spent some of your time being this translator between techies and like people who are not full, you know, who, who aren't writing code and building applications all day long, for example.So maybe we could talk a little bit about like the misunderstandings people have when they come to this in the first place and how you might address some of this because this seems to be your day job and this might be something that could, that might help who are other techies realize how they might change the way they talk about this for other people to make a bit more accessible and intelligible.James Martin: Yes. So, thank you for mentioning day job first and foremost, because, so Scaleway was my former day job and I have another day job working for another french scale app. But here I'm very much speaking in the name of my blog. It's because I care so much about topics that I continue to talk about them, to write about them on the side because it's just, I just think something that needs to be done. So this is why today with my BetterTech hatChris Adams: Hat on. Yeah.James Martin: so yeah, just wanted to make that clear. The first thing that people do when people misunderstand stuff, the first thing I want to say is it's not their fault. Sometimes they are led down the wrong path. Like, a few years ago, French environment Minister said people should stop trying to send so many funny email attachments.Chris Adams: Oh, really? James Martin: Like when you send a joking video to all your colleagues, you should stop doing that because it's not good for the planet. It honestly, the minister could say something that misguided because that's not where we, you and I know, not where the impact is. The impact is in the cloud.The impact is in hardware. So it is sort of, about the communication is repetition and I always start with, digital is 4% of global emissions, and 1% of that is data centers, 3% of that is hardware, and software is sort of all over the place. That's the thing I, the figure I use the most to get things started. I think the, there's number one misconception that people need to get their heads around is the people tend to think that tech is, immaterial. It's because of expressions like the cloud. It just sounds,Chris Adams: Like this floaty thing rather than massive industrial concrete things. Yeah.James Martin: Need to make it more physical. If, I can't remember who said that if data centers could fly, then it would make our job easier. But no, that's where you need to always come back to the figures. 4% is double the emissions of planes. 
And yet, the airline industry gets tens or hundreds of times more hassle than the tech industry in terms of trying to keep control of their emissions. So what you need is a lot more examples, and you need people to explain this impact over time, so you need to move away from bad examples, like funny email attachments, or the thing we keep hearing in AI, that one ChatGPT prompt is 10 times more energy than a Google search. That may or may not be true, but again, it's the wrong example, because it doesn't focus on the bigger picture. Chris Adams: Yeah. That kind of implies that if I just reduce my usage of this, then I'm gonna have like 10 times the impact. That feels a bit like individualizing the problem, surely. Right? James Martin: And it's putting the onus on the users, whereas it's not their fault. You need to see the bigger picture. And this is what I've been repeating since I wrote that white paper, actually: you can't say you have a green IT approach if you're only focusing on data centers, hardware or software. You've got to focus on, yeah, exactly. Holistically. That said, you should also encourage people to have greener habits, because me stopping using ChatGPT just on my own won't have much impact, but it will if I can convince, if I can tell my family, if I can tell my friends, if I can talk about it in podcasts and conferences. Then the more people question their usage, maybe the providers of that tech start providing more frugal alternatives. But, Chris Adams: Ah, I see. So that's like maybe almost like choice architecture, giving people, you know, foregrounding some of the options. So, you know, making it easier to do possibly the more sustainable thing, rather than making people at the end of the process do all the hard work. It sounds like you're suggesting that, okay, as a professional, part of my role is to kind of put different choices in front of someone who's maybe using my service, to make it easier for them to do more sustainable things, rather than things which are much more environmentally destructive, for example. James Martin: Yes. And I would add a final thing, which is sort of super important, because there are topics like electric cars, for example, which people get really emotional and angry about, 'cause people are very attached to their cars, and yet cars are the number one source of emissions in most Western countries. The way around the emotion is, I really focus on only using science-based facts. If it's from the IPCC, if it's from the IEA, if it's really serious scientific studies, then you can use it. If it's just someone speculating on LinkedIn, no. So I always make sure that the data I use is fully backed by science, by all the GHG protocols, looking at all three scopes, all that sort of thing. Because otherwise you just can't, it could be greenwashing. Chris Adams: Okay. Alright, so maybe this is actually a nice segue for the next question, because when people talk about, say, the environmental footprint here, one of the challenges people have is having some numbers, having some figures for any of this stuff. For example, if I'm using maybe a chatbot, it's very hard for me to understand what the footprint might be.
So in the absence of that, you can kind of see how people end up with an idea saying, oh yeah, every query is the same as a, you know, bottle of water, for example, simply because there is a kind of dearth of information. And this is something that I remember, when you presented at Green IO, a conference around green IT, you were talking about how this is actually something that you've had quite a lot of firsthand experience with, particularly when you were working at Scaleway, because there were new calculators published and stuff like that. I mean, we can talk about the AI thing in a bit more detail later, but I wanted to ask you a little bit about the impact calculators that I saw you present before. So are there any principles or any kind of approaches that you think are really helpful when you're helping people engage with a topic like this, when they're trying to use a calculator to kind of modify or improve the footprint as a professional? James Martin: Yes. Well, one of the things that sort of piqued my curiosity when we were looking into the topic at Scaleway is, what percentage of servers or instances are really used? And I was inspired there by the work of Holly Cummins from Red Hat, who famously said that zombie instances possibly represent around 25% of cloud activity. When I asked around, do cloud providers in general try and identify that zombie activity and just shut it down, the consensus I seemed to get was, well, no, because people are paying for those instances. So why would we flag that sort of thing? So that also shows the sort of pushback that an environmental calculator might get. Even though, I mean, you could argue that the fact that there are zombie instances is potentially more the client's fault than the cloud provider's fault. But yeah, building a project like that is just to say that you're going up against a set of habits where, if you want more resources, you can have them, even if you've got too many. It's a, Chris Adams: Yeah, I guess the incentives. James Martin: Yeah, the cloud has been pretty much an all-you-can-eat service in general since it was invented. So trying to get people to use it more responsibly can be seen a bit as going against the grain, but the good news is, it got lots of really positive feedback from clients and, I don't know how it's doing now, but I'm sure it's doing some really useful work. Chris Adams: So I just wanna check one thing, 'cause we said this idea of zombie instances. My guess when you say that is, that's basically a running virtual machine or something like that, that's consuming resources, but it doesn't appear to be doing any obviously useful work. Is that what a zombie is in this context? Right. Okay, cool. And I can kind of see why you might not want to turn people's stuff off, because if you are running a data center, you're kind of incentivized to keep things up, and if you're selling stuff, you're kind of incentivized to make sure there's always stuff available. Right. But I do kind of see your point: if you're not at least making this visible to people, then how are people able to make any responsible choices about, okay, is this really the right size, for example?
And if like a chunk of your revenue is reliant on that, that's probably another reason that you might not wanna do some of that stuff, so. Oh, okay. Alright. So there's like a change of incentives that we may need to think about, but I know that one thing that I have seen people talking about in France a lot is actually not just looking at energy and, yeah, okay, France has quite a clean grid because there's lots of things like low carbon energy, like nuclear and stuff like that, but is there something else to that? Like why, is it just because the energy's clean, there's nothing else to do? Or is there a bigger thing that you need to be aware of if you are building a calculator or making some of these, figures available to people?Is energy the full picture or is there more to it that we should be thinking about?James Martin: No. Exactly. That was the, that was really the real unique point about Scaleway's calculator, is it wasn't just the carbon calculators and so not just energy and emissions, but also the impact of hardware and also the impact of water, how much water is your data center using? And was a really important part of the project. And I remember my colleagues telling me the most challenging part of the project was actually getting the hardware data off the manufacturers. 'Cause they don't necessarily declare it. Nvidia, for example, still gives no lifecycle analysis data on their GPUs. So, it's incredible. But, there it is. So basically, what Scaleway set out to do is the opposite of what AWS does, which is, AWS says, we've bought all this green energy, renewable energy, we've bought enough carbon credits to cover us for the next seven years. Therefore, your cloud is green.Chris Adams: Nothing to do. No changes. Yeah.James Martin: Yeah. Which is completely false because it's ignoring the scope three, which is the biggest share of emissions, the emissions. So all of that is ignored. I worked out from a report a while ago that nearly 65% of the tech sector's emissions are unaccounted for. It's a complete, in the dark. Then if you consider that only 11% of tech impacts our emissions, the rest is hardware,then we're really, what the information that we've got so far is like, it's portion of the real impact. So that was why, it was such a big deal that Scaleway was setting out to, to cover much of the real impact as possible. Becauseonce you have as broad a picture of as possible of that impact, then you can make the right decisions. As you were saying, Chris, the, then you can choose, I'm going to go for data centers in France because as they say, as you, they, because they have this lower carbon intensity, I might try and use this type of product because it uses less energy. I'd say that is a, that is an added value provider can bring that should attract more clients, I'd have thought, with what with, you've got things like CSRD and all sorts of other Chris Adams: Yeah, it's literally written into the standards that you need to declare scope three for cloud and services and data centers now. So if getting that number is easier, then yeah, I can see why that would be helpful actually.James Martin: Absolutely.Chris Adams: All right. We'll share a link to that specific part of the European Sustainability reporting standards. 'Cause it kind of blew my mind when I saw it actually. Like I didn't realize it was really that explicit. And that's something that we have. 
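The zombie instances discussed above, machines that stay running while doing no obviously useful work, are something a team can look for in its own estate by checking utilization over a long window. The sketch below uses AWS CloudWatch via boto3 purely as one possible example; the 3% CPU threshold and 14-day look-back are arbitrary assumptions, and other clouds expose equivalent metrics. Low CPU alone does not prove an instance is a zombie, so treat the output as a list of candidates to review, not machines to delete.

```python
from datetime import datetime, timedelta, timezone

import boto3  # assumes AWS credentials are already configured in the environment

CPU_THRESHOLD = 3.0   # percent; "barely doing anything" is a judgement call
LOOKBACK_DAYS = 14

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")


def average_cpu(instance_id: str) -> float:
    """Average CPU utilization over the look-back window, from CloudWatch."""
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=LOOKBACK_DAYS)
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=start,
        EndTime=end,
        Period=3600,          # hourly datapoints
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    if not points:
        return 0.0            # no metrics at all is itself worth a look
    return sum(p["Average"] for p in points) / len(points)


def find_zombie_candidates() -> list[str]:
    """List running instances whose average CPU stayed below the threshold."""
    zombies = []
    pages = ec2.get_paginator("describe_instances").paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                if average_cpu(instance["InstanceId"]) < CPU_THRESHOLD:
                    zombies.append(instance["InstanceId"])
    return zombies


if __name__ == "__main__":
    for instance_id in find_zombie_candidates():
        print(f"possible zombie: {instance_id}")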
So you mentioned Nvidia, and you mentioned there's a kind of somewhat known environmental footprint associated with the actual hardware itself. And as I understand it, you mentioned GenAI Impact, which is an organization that's been doing some work to make some of these numbers a bit more visible to people when they're using some of that. Maybe I could just ask you a little bit, as I understand it, is GenAI Impact based primarily in France? Is that, James Martin: Yeah. So my origin story for that was, it was again Green IO, more hats off to Gael. So that was at Green IO Paris 2023. It ended with a presentation from Théo Alves Da Costa, who is the co-president of Data for Good, an NGO which has something like 6,000 data scientists and engineers who are all putting their skills to use for good, basically as volunteers. And so he did this presentation, which notably drew on a white paper from Data for Good, which said that we didn't really know that much at the time, but that the impact of inference could be anything from 20 to 200 times more than the impact of training. And he showed it with these bubbles, and I just looked at it and went, oh my God, this goes way beyond any level of cloud impact that we've been used to before. So, yeah, that drew me to get interested. I went to Data for Good's next meeting, which launched GenAI Impact, which is the project which ended up producing Ecologits.ai, which is a super handy calculator for, Chris Adams: this is a tool that, as I understand it, plugs into whatever generative AI tools you're using, 'cause we looked through it ourselves. Like, if you're using maybe some Python code to call ChatGPT or Mistral or something, it will give you some of the numbers as you do it, and it'll give you the hardware, the water usage and stuff like that. It gives you some figures, right? James Martin: Exactly. And the way it does it is pretty clever. It will mostly measure open source models, which is easy because you know what their parameters are and all the data is open. And it will compare that with closed models. So it will be able to give you an estimation of the impact of closed models like ChatGPT, so you can use it to say, what is the impact of writing a tweet with ChatGPT versus what is the impact of doing it with Llama or whatever? And because big tech is so opaque, and this is one of my big bugbears, it means that it gives us a sort of, Chris Adams: That's the best you've got to go on, for people like me. Yeah. James Martin: very educated guess, which is something that should encourage people to use frugal AI. That's the idea. Chris Adams: Okay. So this is one thing that I'm always amazed by when I go to France, because the field seems to be further along, definitely further along than in Germany, for example. And, for example, France had the AI Action Summit this year. It's the only country in the world where the government supported a frugal AI challenge. You've mentioned this a few times, so I might give you a bit of space to actually tell people what frugal AI actually is.
I mean, maybe we could talk, how does a conversation about AI, for example, differ in France compared to maybe somewhere else in the world that you've experienced? Because it does feel different to me, but I'm not quite sure why. And I figure as someone who's in France, you've probably got a better idea about what's different and what's driving that. James Martin: Yeah, it really is the place to be, let's say. You may have seen that Paris just moved ahead of London as one of the best places for startups to be at the moment, and one of the reasons for that is that very strong AI ecosystem. Everyone thinks of Mistral first and foremost, but there are lots of others. But before I get into that, I wanted to talk first about why we need frugal AI. Because it's not something that people think about on a daily basis, like I was saying before. My wife, the other day, she's a teacher, and she was using ChatGPT to help prepare her lesson. And I was like, no, don't use that, there are lots of other alternatives. But to her it's just, of course, there, and it is to the 800 million people who use it every week. They do it because it's free and they do it because it works really well. But what they don't know is that because of tools like ChatGPT, and we know that ChatGPT is amongst the highest-impact models, data center energy consumption is going to triple or maybe even quadruple by the end of the decade. And data center water consumption is going to quadruple by the end of the decade. And there are lots of very serious studies, most of which came out at the end of last year, and they all concur. If you put all of their graphs together, they're very similar, and the scariest thing about them, in fact, is that they show that data center energy consumption has been pretty much flat for the past years, because whilst cloud usage has been surging something like 500%, the data center operators like Scaleway and lots of other companies have been able to optimize that energy usage and keep it flat. The problem is that all of this has been based on CPUs, and because AI uses GPUs, which use four times more energy and heat up 2.5 times more than CPUs, the curve has gone like this. It's done a complete dog leg. The consumption of GPUs is just on such a different scale that the tricks that kept it under control before don't work anymore. So we've really reached a tipping point. And it is partly because people are generating millions of Ghibli images, starter packs, or, I'm simplifying a lot, but what I'm questioning is, when you look at that graph, how much of this activity is really useful? How much of it is curing cancer or, the greatest joke of all, fixing climate change? When what's happening is it's making it worse. And this dog leg is so sharp that we can't build nuclear power quickly enough to fill up this demand. So what's happening is that coal-burning energy generators, or gas, are being kept open so that we can keep making those images and doing our homework and all that sort of thing. So that, in a nutshell, is why we need frugal AI.
And we need it also because of the way it has been built. If you haven't read the book Empire of AI by Karen Hao yet, it's very strongly recommended, because one of the things it explains is that, in the genesis of OpenAI, at some point they decided that the bigger your model is, basically the more compute power it uses, the better it will be. And they've just been building on that premise ever since the launch of ChatGPT. Whereas the fact is, the most recent versions of ChatGPT, or GPT, actually hallucinate more than the less powerful versions. So why do we need to throw all that power at it? When, as we see from talking to people like the amazing Sasha Luccioni, with LLMs, for example, you have models that are 30 to 60 times smaller which can do just as good a job. So these are the sorts of conversations that you can have a lot in France, which is really standing out today as a frugal AI pioneer. The fact that over 90% of French electricity is carbon free helps a lot. That's something that Mistral in particular leans on a lot: we've got clean energy, therefore we are green. Watch out for the AWS effect. But it is a very important point, because all the ChatGPT and other impact is happening in America. And so I was very happy to see that, because of big tech's opacity, EcoLogits, which as you mentioned is a Python library, very quickly became a global reference, because that's all we had. Chris Adams: Okay, so when the bar's on the floor, it doesn't need to be very high, right? James Martin: Yeah, exactly. I think my favorite tweet of the year so far is from Sam Altman. I can even share the link to the tweet, because I love slash hate it so much. It was when all these millions of Ghibli images happened, and he joked that GPUs were melting. He shared this completely ridiculous graph, which said, this is the water impact of one ChatGPT query, and this is the water impact of one burger. Chris Adams: Yeah. James Martin: Sam Altman's comment in the tweet was: anti-AI people making up shit about the impact of ChatGPT whilst eating burgers. And I just found it so cynical, because, A, I'm not anti-AI, I'm anti-waste. And so, that's the third point: the reason that people have to make shit up is because they don't declare or give access to any of the numbers. Yeah. If they did, we wouldn't even be having this conversation. We would be able to say, ChatGPT is this, Llama is this. And we'd be able to compare everyone on the same playing field. But, Chris Adams: On their merits. Yeah. James Martin: Yeah. So coming back to France, because I'm wary of going off on a rant. The French government has been really, sort of, incredible on this topic. Around the time of that AI Action Summit, they supported a frugal AI challenge, whereby people were encouraged to complete AI tasks across audio, text, and image, and you would win the challenge by completing the tasks whilst using x times less energy than the big LLMs. And of the projects that won, one of them used 60 times less energy than a big LLM, proving that these big LLMs are not necessary. Chris Adams: And it was solving the same task. 'Cause I think, from memory, there were a few challenges which were like, you know, combat disinformation online, discover something useful there.
They weren't, you know, these were considered socially useful problems, but people were free to take any kind of approach they wanted. So what you're saying is that, okay, you could use an LLM to solve one of them, solve it one way, but there were other ways that they solved it, and some of the winners were, you know, 60 times more efficient, essentially 60 times less consumptive. Right. James Martin: Exactly. So, yeah, it's great to have projects like that. The French government has also obtained funding for around a dozen frugal AI projects, which are being run by municipalities all over France. So they're using it to optimize energy usage, or to detect garbage in the street, or that sort of thing. So that's great. The French government also supports the frugal AI guidelines of AFNOR. AFNOR is France's International, sorry, is France's official standards organization, and what they've done is basically to say, for your AI to be frugal, it needs to correspond with these criteria. The first criterion, which I love, is, can you prove that this problem cannot be solved by anything other than AI? And it's pretty strict. There are three first steps, but then it goes into a lot of detail about what is or is not frugal AI, and that's such pioneering work that it's on track to become an EU standard. That's really some great work there. But I think, for me, one of the best arguments I use about why you should bother with frugal AI is, very simply, the French Ministry for the Environment has said to startups, if you want to work with us, you have to prove that your AI is frugal first. So, Chris Adams: Oh, okay. So it's like they're creating demand pull then, essentially. So, you know, this is your carrot. Your carrot is a fat government contract, but you need to demonstrate that you're actually following these principles in what you do. James Martin: I love that because it shows that doing things frugally can actually be good for your business. Chris Adams: Okay. Alright. So, wow. I think we should definitely make sure we've got some links for a bunch of that stuff, 'cause I wasn't aware of those. I know that France, in the kind of world of W3C, they have, I can never, it's the RGESN, and I'm not gonna butcher the pronunciation, but it broadly translates to like a general policy for eco-design, and I know that's like a standards track for Europe. James Martin: Yes. Chris Adams: If I can find the actual French words, I might try to share it, or maybe you might be able to help me with that one, because my French is nowhere near good enough to spell it properly. But I'm also aware that France is actually one of the first countries in the world to actually have a digital sustainability law. There was one in 2020, the REEN, the, Oh yeah. James Martin: That's it. That's it. Yeah. I was very focused on AI with all those examples. But yeah, France is the only country which has a Digital Responsibility Act, called REEN, which basically says, for example, that any municipality with over 50,000 inhabitants has to publish their digital responsibility strategy, even if it's just, we are going to buy older hardware, we are going to keep our PCs going for longer, or sort of simple stuff like that.
This French law only demands that localities, municipalities, make an effort on these things, but that they show they are making an effort. So, Chris Adams: I see. James Martin: it's a sort of great incentive. Chris Adams: Ah, okay. So that I now understand. So with the RGESN, as I understand it, that was essentially something like a set of guidelines for France. Ah, so, James Martin: yeah, it's two different things. The RGESN is the guidelines for eco-conception, so, how to make your website not only more energy efficient, but also more accessible to people of varying abilities. There's also a law that just came into effect here in France to make websites more accessible. So it is great to see those two things going hand in hand. They also announced at the AI Action Summit that they were going to invest a hundred billion in new data centers for AI by the end of the decade. You win some, you lose some. But maybe better to do that here with lower carbon than in the States, which generally speaking has 10 times more carbon in its electricity. Chris Adams: Okay. It sounds like there's a lot happening in France. So not only is there this idea of frugal AI and digital sobriety, which is this other French term, which when translated into English always sounds really strange to my ears, but there's actually quite a lot of, for want of a better word, policy support behind this stuff to actually encourage people to work in this way, basically, huh? James Martin: Absolutely. And again, I would give another nod to Data for Good for that, because they were instrumental in that frugal AI challenge, along with Sasha Luccioni. Chris Adams: Okay. James Martin: By the way, she'll be speaking at Viva Tech. So, Viva Tech is France's biggest tech event. It's actually one of the biggest tech events in Europe. Unfortunately, they had Elon Musk as their keynote last year and the year before. Fortunately they won't this year. Chris Adams: Yeah. James Martin: Sasha is going to be one of their keynotes this year, which is also great, I think it's a good sign. And she will also be speaking on a panel as part of a sustainability summit with Kate Kallot of Amini AI. And I'll be part of that conversation. So I'm happy these sorts of conversations are happening. Not, Chris Adams: But more mainstream by the sounds of things. James Martin: Not only between people like you and me, who care and who understand all the tech. But it's super important, as I was saying at the beginning, to be having these conversations with as broad an audience as possible, because otherwise nothing's gonna change. Chris Adams: Okay, so we've spoken about, we've gone quite deeply into AI and hardware and water and stuff like that. If we pull back out and talk about how people might engage with this topic in the first place: if there's one thing you could change about how people talk about sustainability, particularly in technology, what would you change, James? James Martin: I suppose I'd sum it up as, don't believe the hype. And the hype in tech is usually, bigger is better. What I would like people to try and really integrate is that bigger isn't always better. As we said before, it is very important to look at the holistic picture of impacts rather than just the individual ones.
It's more important to pressure companies to change, as you see with that French government example, rather than making users feel guilty, because again, it's not their fault. And what I'm trying to do as often as I can, Chris, is just bring people back to that sort of gold standard of green IT, which is: only use the right tools for the right needs. This is why this sort of bigger-is-better thing is just so irritating to me. The way AI is being done right now, it's a classic in tech. It's using a bazooka to swat a fly. It's not necessary. And not only is it ridiculous, but it's also very bad for the planet. So if you only need to do this much, you only need a tool that does this much, not this much. And that's one of the reasons why, when I hear the term AI for Good, which we hear a lot of at the moment, I would challenge that, and I would encourage people to challenge that too, by saying, "are you sure this AI is for good? Are you sure this tech is for good? Are you sure that the good that it does far outweighs the potential harm that it has?" Because it's not always the case. A lot of the AI for good examples we see at the moment just can't be backed with scientific data at all. And that comes back to another of my points. If you can't prove that it's for good, then it's not, and it's probably greenwashing. Chris Adams: Okay. So show us your receipts then, basically, yeah. James Martin: Yeah. Chris Adams: Okay. Well, thanks for that, James. We're just coming up to time now. So if people have found this interesting and they wanted to learn more about either your writing or where you'll be next, where should people be looking? I mean, you mentioned the website, for example; is there anywhere else people should be looking to keep up with updates from you or anything like that? James Martin: The website is BetterTech.blog. So yeah, that's the main one, that's where you can find a lot more resources about my work on the impact of AI and on other things. I also post frequently on LinkedIn about this sort of thing; the last one was about frugal prompting. That's my latest discovery. And, yeah, those are the two main sources. And we'll work together to make sure that the, Chris Adams: We have all the links in the show notes and everything like that. James Martin: of this, of this episode. Chris Adams: Brilliant. Well, James, thank you so much for giving me the time, and to everyone listening, for all of this. And I hope you enjoy the rest of the day in what appears to be sunny Paris behind you. James Martin: It's been sunnier, but it's fine. Chris Adams: Okay. James Martin: It's still Paris, so I can't grumble. Thanks very much. Chris Adams: Indeed. James Martin: Thanks very much, Chris. Like I said, it's been a real honor to be on this podcast, and I hope we've been able to share something that's useful for people. Chris Adams: Merci beaucoup, James. James Martin: Merci as well, Chris. Chris Adams: Hey everyone, thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners. To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation in any browser. Thanks again, and see you in the next episode.
Hosted on Acast. See acast.com/privacy for more information.
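For readers who want to try the EcoLogits library James describes, the pattern it documents is roughly the one sketched below: initialize the library once, make your usual provider call, and read the impact estimates it attaches to the response. This is a sketch based on that documented pattern, not a definitive reference; the exact attribute names, units, and supported providers may differ between versions, and the `gpt-4o-mini` model name is just a placeholder.

```python
from ecologits import EcoLogits
from openai import OpenAI

# Instrument supported provider clients so their responses carry impact estimates.
EcoLogits.init()

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Write a two-line tweet about frugal AI."}],
)

print(response.choices[0].message.content)

# EcoLogits attaches estimated impacts to the response object. Field names here
# follow the library's documented structure, but check the docs for your version.
impacts = response.impacts
print("Energy:", impacts.energy.value, impacts.energy.unit)
print("Greenhouse gases:", impacts.gwp.value, impacts.gwp.unit)
print("Primary energy:", impacts.pe.value, impacts.pe.unit)
```

Comparing these numbers across an open model and a closed one is exactly the "educated guess" James describes: an estimate, but one grounded in a published methodology rather than speculation.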
Why You Need Hardware Standards for Green Software
Chris Adams is joined by Zachary Smith and My Truong, both members of the Hardware Standards Working Group at the GSF. They dive into the challenges of improving hardware efficiency in data centers, the importance of standardization, and how emerging technologies like two-phase liquid cooling systems can reduce emissions, improve energy reuse, and even support power grid stability. They also discuss grid operation and the potential of software-hardware coordination to drastically cut infrastructure impact. Learn more about our people:Chris Adams: LinkedIn | GitHub | WebsiteZachary Smith: LinkedIn | WebsiteMy Truong: LinkedIn | WebsiteFind out more about the GSF:The Green Software Foundation Website Sign up to the Green Software Foundation NewsletterResources:Hardware Standards Working Group | GSF [06:19]SSIA / Open19 V2 Specification [12:56]Enabling 1 MW IT racks and liquid cooling at OCP EMEA Summit | Google Cloud Blog [19:14] Project Mycelium Wiki | GSF [24:06]Green Software Foundation | Mycelium workshop EcoViser | Weatherford International [43:04]Cooling Environments » Open Compute Project [43:58]Rack & Power » Open Compute Project Sustainability » Open Compute Project 7x24 Exchange [44:58]OpenBMC [45:25]If you enjoyed this episode then please either:Follow, rate, and review on Apple PodcastsFollow and rate on SpotifyWatch our videos on The Green Software Foundation YouTube Channel!Connect with us on Twitter, Github and LinkedIn!TRANSCRIPT BELOW:Zachary Smith: We've successfully made data centers into cloud computing over the past 20 or 25 years, where most people who use and consume data centers never actually see them or touch them. And so it's out of sight, out of mind in terms of the impacts of the latest and greatest hardware or refresh. What happens to a 2-year-old Nvidia server when it goes to die? Does anybody really know? Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software. I'm your host, Chris Adams. Hello and welcome to Environment Variables, the podcast where we explore the latest in sustainable software development. I'm your host, Chris Adams. Since this podcast started in 2022, we've spoken a lot about green software, how to make code more efficient so it consumes fewer resources or runs on a wider range of hardware to avoid needless hardware upgrades, and so on. We've also covered how to deploy services into data centers where energy is the cleanest, or even when energy is the cleanest, by timing compute jobs to coincide with an abundance of clean energy on the grid. However, for many of these interventions to work, they rely on the next layer down from software, the hardware layer, to play along. And for that to work at scale, you really need standards. Earlier this year, the SSIA, the Sustainable and Scalable Infrastructure Alliance, joined the Green Software Foundation. So now there's a Hardware Standards Working Group, or HSWG, within the Green Software Foundation too. Today we're joined by two leaders in the field who are shaping the future of sustainable software. So, oops, sustainable hardware. We've got Zachary Smith, formerly of Packet and Equinix, and My Truong from ZutaCore.
We'll be discussing hardware efficiency, how it fits into the bigger sustainability picture, the role of the Open19 standard, and the challenges and opportunities of making data centers greener.So let's get started. So, Zachary Smith, you are alphabetically ahead of My Truong, Mr. Truong. So can I give you the floor first to introduce yourself and tell a little bit about yourself for the listeners?Zachary Smith: Sure. Thanks so much, Chris. It's a pleasure being here and getting to work with My on this podcast. As you mentioned, my name's Zachary Smith. I've been an entrepreneur, primarily in cloud computing for, I guess it's about 25 years now. I went to Juilliard. I studied music and ended up figuring that wasn't gonna pay my rent here in New York City and in the early two thousands joined a Linux-based hosting company. That really gave me just this full stack view on having to put together hardware. We had to build our own computers, ran data center space, oftentimes helped build some of the data centers, connect them all with networks, travel all around the world, setting that up for our customers. And so I feel really fortunate because I got to touch kind of all layers of the stack. My career evolved touch further into hardware. It just became a true passion about where we could connect software and hardware together through automation, through accessible interfaces, and other kinds of standardized protocols, and led me to start a company called Packet, where we did that across different architectures, X86 and ARM, which was really coming to the data center in the 2014/15 timeframe. That business Equinix, one of the world's largest data center operators. And at that point we really had a different viewpoint on how we could impact scale, with the sustainability groups within Equinix as one of the largest green power purchasers in the world, and start thinking more fundamentally about how we use hardware within data centers, how data centers could speak more or be accessible to software users which as we'll, unpack in this conversation, are pretty disparate types of users and don't often get to communicate in good ways. So, I've had the pleasure of being at operating companies. I now invest primarily businesses around the use of data centers and technology as well as circular models to improve efficiency and the sustainability of products.Chris Adams: Cool. Thank you Zachary. And, My, can I give you the floor as well to introduce yourself from what looks like your spaceship in California?My Truong: Thanks. Thanks, Chris. Yes. So pleasure being here as well. Yeah, My Truong, I'm the CTO at ZutaCore, a small two-phase liquid cooling organization, very focused on bringing sustainable liquid cooling to the marketplace. Was very fortunate to cross over with Zach at Packet and Equinix and have since taken my journey in a slightly different direction to liquid cooling. Super excited to join here. Come from, unfortunately I'm not a musician by a classical training. I am a double E by training. I'm joining here from California on the west coast of the Bay Area.Chris Adams: Cool. Thank you for that, My. Alright then. So, my name is Chris. If you're new to this podcast, I work in the Green Web Foundation, which is a small Dutch nonprofit focused on an entirely fossil free internet by 2030. And I'm also the co-chair of the policy working group within the Green Software Foundation.Everything that we talk about, we'll do our best to share links to in the show notes. 
And if there's any particular thing you heard us talking about that you're really interested in that isn't in the show notes, please do get in touch with us, because we want to help you in your quest to learn more about green software, and now green hardware. Alright then, looks like you folks are sitting comfortably. Shall we start? Zachary Smith: Let's do it. Chris Adams: All right then. Cool. Okay. To start things off, Zachary, I'll put this one to you first. Can you just give our listeners an overview of what a hardware standards working group actually does, and why having standards for things like data centers actually helps? I mean, you can assume that our listeners might know that there are web standards that make websites more accessible and easier to run on different devices, so there's a sustainability angle there, but a lot of our listeners might not know that much about data centers and might not know where standards would be helpful. So maybe you can start with a concrete case of where this is actually useful in helping make any kind of change to the sustainability properties of maybe a data center or a facility. Zachary Smith: Yeah. That's great. Well, let me give my viewpoint on hardware standards and why they're so critical. We're really fortunate, actually, to enjoy a significant amount of standardization in consumer products, I would say. There's working groups, things like the USB Alliance, that have really provided, just in recent times, for example, standardization, whether that's through market forces or regulation, around something like USB-C, right, which allowed manufacturers and accessories and cables and consumers to not have extra or throw away good devices because they didn't have the right cable to match the port. Right? And so beyond this interoperability aspect to make these products work better across an intricate supply chain and ecosystem, they also could provide real sustainability benefits in terms of just reuse. Okay. In data centers, it's an amazing thing, being that we can unpack some of the complexities related to the supply chain. These are incredibly complex buildings full of very highly engineered systems that are changing at a relatively rapid pace. But the real issue from my standpoint is, we've successfully made data centers into cloud computing over the past 20 or 25 years, where most people who use and consume data centers never actually see them or touch them. And so it's out of sight, out of mind in terms of the impacts of the latest and greatest hardware or refresh. What happens to a 2-year-old Nvidia server when it goes to die? Does anybody really know? You kind of know in your home or with your consumer electronics, and you have this real waste problem, so then you have to deal with it. You know not to put lithium ion batteries in the trash, so you find the place to put them. But you know, when it's the internet and it's so far away, it's a little bit hazy for, I think, most people to understand the kind of impact of hardware and the related technology, as well as what happens to it. And so that's, I'm gonna say, one of the challenges in the broader sustainability space for data center and cloud computing. One of the opportunities is that, maybe different from consumer, we know actually almost exactly where most of this physical infrastructure shows up. Data centers don't move around usually. And so they're usually pretty big. They're usually attached to long-term physical plants, and there's not millions of them. There's thousands of them, but not millions.
And so that represents a really interesting opportunity for implementing really interesting, which would seem complex, models. For example, upgrade cycles, parts replacement, or upskilling of hardware. Those things are actually almost more doable logistically in data centers than they are in the broader consumer world, because of where they end up. The challenge is that we have this really disparate group of manufacturers that frankly don't always have aligned incentives for making things work together. Some of them actually define their value by, "did I put my logo on the left or did I put my cable on the right?" You have a business model, which would be the infamous Intel tick-tock model, which is now maybe Nvidia. My, what's Nvidia's version of this? I don't know. But its 18 month refresh cycles are really put out as a pace of innovation, which are, I would say, in many ways quite good. But in another way, it requires this giant purchasing cycle to happen, and people build highly engineered products around one particular set of technology and then expect the world to upgrade everything around it, when you have data centers and the related physical plant where maybe 90 or 95% of this infrastructure can be very consistent. Things like sheet metal and power supplies and cables and so on. Like, I think that's where we started focusing a couple years ago: "how could we create a standard that would allow different parts of the value chain, throughout data center hardware, data centers, and related, to benefit from an industry-wide interoperability?" And that came to really fundamental things that take years to go through supply chain, and that's things like power systems, now what My is working on with related cooling systems, as well as operating models for that hardware in terms of upgrade or life cycling and recycling. I'm not sure if that helps, but this is why it's such a hard problem, but also so important to make a reality. Chris Adams: So if I'm understanding, one of the advantages of having the standards here is that you get to decide where you compete and where you cooperate, with the idea being that, okay, we all have a shared goal of reducing the embodied carbon in maybe some of the materials you might use, but people might have their own specialized chips. And by providing some agreed standards for how they work with each other, you're able to use, say, maybe different kinds of cooling, or different kinds of chips without, okay. I think I know, I think I know more or less where you're going with that then. Zachary Smith: I mean, I would give a couple of very practical examples. Can we make computers where you can pop out the motherboard and have an upgraded CPU, but still use, like, the rest of the band, yeah, the power supplies, et cetera? Is that a possibility? Only with standardization could that work. Some sort of open standard. And standards are a little bit different in hardware. I'm sure My can give you some color, having recently built the Open19 V2 standard. It's different than the software, right? Which is relatively, I'm gonna say, quick to create, quick to change. And also different licensing models, but hardware specifications are their own beast and come with some unique challenges. Chris Adams: Cool. Thank you for that, Zach. My, I'm gonna bring the next question to you, because we did speak a little bit about Open19, and that was one thing that was a big thing with the SSIA.
So as I understand it, the Open19 spec, which we referenced, that was one of the big things that the SSIA was a kind of steward of. And as I understand it, there's already an existing different standard that defines the dimensions of, say, a 19 inch rack in a data center. So they need to be the same size and everything like that. But that has nothing to say about the power that goes in, or how you cool it, or things like that. I assume this is what some of the Open19 spec was concerning itself with. I mean, maybe you could talk a little bit about why you even needed that, or if that's what it really looks into, and why that's actually relevant now, or why that's more important, say, halfway through the 2020s, for example. My Truong: Yeah, so Open19, the spec itself, originated from a group of folks starting with the LinkedIn organization at the time. Yuval Bachar put it together along with a few others. As that organization grew, it was inherited by the SSIA, which became a Linux Foundation project. What we did when we became a Linux Foundation project is rev the spec. The original spec was built around 12 volt power. It had a power envelope that was maybe a little bit lower than what we knew we needed to go to in the industry. And so what we did when we revised the spec was to bring both 48 volt power and a much higher TDP to it, and brought some consistency to the design itself. So, as you were saying earlier, EIA/TIA has a 19 inch spec that defines a rail to rail, but no additional dimensions beyond just a rail to rail dimension. And so what we did was we built a full, I'm gonna air quote, "mechanical API" for software folk. So, how do we consistently deliver something? You can create variation inside of that API, but the API itself is very consistent on how you mechanically bring hardware into a location, how you power it up, how you cool it. It allows for variations of cooling, but has a consistent API for bringing cooling into that IT asset. What it doesn't do is really dive into the rest of the physical infrastructure delivery. And that was very important in building a hardware spec: that we didn't go over and above what we needed to consistently deliver hardware into a location. And when you do that, what you do is you allow for a tremendous amount of freedom in how you go and bring the rest of the infrastructure to the IT asset. So, in the same way, when you build a software spec, you don't really concern yourself with what language you put behind it, or how the rest of that infrastructure works, whether you have, like, a communication bus, or it's semi API driven with a callback mechanism. You don't really try to think too heavily around that. You build the API and you expect the API to behave correctly. And so what that gave us the freedom to do is, when we started bringing 48 volt power, we could then start thinking about the rest of the infrastructure a little bit differently, when you bring consistent sets of APIs to cooling and to power. And so when we started thinking about it, we saw this trend line here. We knew that we needed to go think about 400 volt power. We saw the EV industry coming. There was a trend line towards 400 volt power delivery.
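To make that "mechanical API" framing a little more concrete for software listeners, here is a loose analogy expressed as code. It is only a sketch: the class names, fields, and numbers below are illustrative and are not taken from the Open19 specification. The point is simply that the interface (dimensions, power feed, cooling connection) stays fixed, while whatever sits behind it is free to vary.

```python
# A loose software analogy for the "mechanical API" idea: the slot interface
# is fixed by the spec, the brick behind it can be anything that conforms.
# All names and numbers here are illustrative, not from Open19.
from dataclasses import dataclass
from typing import Protocol


@dataclass(frozen=True)
class RackSlotInterface:
    """The fixed 'contract' a brick must meet to drop into a slot."""
    width_mm: float          # rail-to-rail dimension, fixed by the spec
    power_feed_volts: float  # e.g. 48 V in the revised spec
    max_power_watts: float   # power envelope per slot
    cooling_interface: str   # e.g. "blind-mate liquid coupling" or "air"


class Brick(Protocol):
    """Anything that conforms to the slot interface, however it is built inside."""
    def fits(self, slot: RackSlotInterface) -> bool: ...


@dataclass
class GpuBrick:
    width_mm: float = 480.0
    volts: float = 48.0
    watts: float = 1800.0
    cooling: str = "blind-mate liquid coupling"

    def fits(self, slot: RackSlotInterface) -> bool:
        # Vary what's inside the brick freely; only the interface must match.
        return (self.width_mm <= slot.width_mm
                and self.volts == slot.power_feed_volts
                and self.watts <= slot.max_power_watts
                and self.cooling == slot.cooling_interface)


slot = RackSlotInterface(480.0, 48.0, 2000.0, "blind-mate liquid coupling")
print(GpuBrick().fits(slot))  # True: interoperable without knowing the internals
```

The design point the sketch tries to echo is the one My makes next: specify only the contract, and leave the rest of the infrastructure free to change.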
What we did inside of that hardware spec was we left some optionality inside of the spec to go and change the way that we would go do work, right? So we gave some optional parameters to the infrastructure teams to go and change up what they needed to do, so that they could deliver that hardware, that infrastructure, a little bit more carefully or correctly for their needs. So we didn't over specify in particular areas. I'll give you a counterexample: in other specifications out there, you'll see a very consistent busbar in the back of the infrastructure that delivers power. It's great when you're at a... Chris Adams: So if I can just stop you for a second there, My. The busbar, that's the thing you plug a power thing into instead of a socket. Is that what you're referring to there? My Truong: Oh, so, good question, Chris. So in what you see in some of the hyperscale rack-at-a-time designs, you'll see two copper bars sitting in the middle of the rack in the back, delivering power. And that looks great for an at-scale design pattern, but may not fit the needs of smaller or more nuanced design patterns that are out there. Does that make sense? Chris Adams: Yeah. Yeah. So instead of having a typical, kinda like three-way kind of kettle style plug, the servers just connect directly to this bar to provide the power. That's what one of those bars is. Yeah. Gotcha. My Truong: Yep. And so we went a slightly different way on that, where we had a dedicated power connection per device that went into the Open19 spec. And the spec is up, I think it's still up on our ssia.org website. And so anybody can go take a look at it and see the mechanical spec there. It's a little bit different. Chris Adams: Okay. All right. So basically, previously there was just a spec that said "computers need to be this shape if they're gonna be server computers in a rack." And then Open19 was a little bit more about saying, "okay, if you're gonna run all these at scale, then you should probably have some standards about how power goes in and how power goes out." Because if nothing else, that allows them to be somewhat more efficient. And there's various considerations like that, that you can take into account. And you spoke about shifting from maybe 48 volts to, like, 400 volts, and that there are efficiency gains when you do things like that, which we probably don't need to go into in too much detail, because it allows you to move more power without so much being wasted, for example. These are some of the things that the standards are looking into. And well, in the last 10 years, we've seen a massive shift from data center racks which use quite a lot of power to some which use significantly more. So maybe 10 years ago you might have had a cloud rack that would be between five and 15 kilowatts of power. That's, like, tens of homes. And now we're looking at racks which might be, say, half a megawatt or a megawatt of power, which is maybe hundreds if not thousands of homes worth of power. And therefore you need, say, refreshed and updated standards. And that's where the V2 thing is moving towards. Right. My Truong: Okay. Chris Adams: Okay, cool. So yeah. Zachary Smith: Just, the hard thing about hardware standards is that the manufacturing supply chain moves slowly, unless you are an end-to-end verticalizer, like some of the hyperscale customers can really verticalize.
They build the data center, they build the hardware, lots of the same thing. They can force that. But a broader industry has to rely upon a supply chain. Maybe OEMs, third party data center operators, 'cause they don't build their own data center, they use somebody else's. And so what we accomplished with V2 was to allow for this kind of innovation within the envelope, and one of our guiding principles was: how could we provide the minimal amount of standardization that would allow for more adoption to occur while still gaining the benefits? Chris Adams: Ah. Zachary Smith: And so it's a really difficult friction point, because your natural reaction is to, like, solve the problem. Let's solve the problem as best we can. But that injects so much opinion that it's very hard to get adopted throughout the broader industry. And so even things like cooling: single phase or two phase, full immersion or not, this kind of liquid or this way, different types of pressure, whatever. There's all kinds of implications, whether those are technology use or regulatory situations across different environments. So I think that's the challenge that we've had with hardware standards: how to make it meaningful while still allowing it to evolve for specific use cases. Chris Adams: Alright. Okay. So, I think I'm understanding a bit now. And I'll try and put it in context with some of the other podcast episodes we've done. So we've had people come onto this podcast from, like, Google, for example, or Microsoft, and they talk about all these cool things at their entirely vertically designed data centers, where they're in the entire supply chain. They do all these cool things with the grid, right? But all those designs, a lot of the time, these might be custom designs, in the case of Google, where no one gets to see them. Or in some cases, like say Meta or some other ones, it may be Open Compute, which is a different size to most people's data centers, for example. So you can't just, like, drop that stuff in. Like, there's a few of them around, but it's still 19 inches that's the default standard in lots of places. And if I understand it, one of the goals of Open19 is to essentially bring everyone else along who has already standardized on these kinds of sizes, so they can start doing some of the cool grid aware, carbon aware stuff that you see people talking about, that you probably don't have that much access to if you're not already Meta, Google or Facebook with literally R&D budgets in the hundreds of millions. Zachary Smith: Yeah, maybe add some zeros there. Yeah, I think absolutely, right, which is democratizing access to some of this innovation, right? While still relying upon, and helping within, the broader supply chain. For example, if EVs are moving into 400 volt, we can slipstream and bring that capability to data center hardware supply chains, 'cause the people making power supplies or components or cabling are moving in those directions, right? But then it's also just allowing for the innovation, right? Like, I think we've firmly seen this in software. I think this is a great part of Linux Foundation, which is, no one company owns the, you know, monopoly on innovation.
And what we really wanna see was not, like, can we make a better piece of hardware, but can we provide some more foundational capabilities, so that hundreds of startups or different types of organizations that might have different ideas or different needs or different goals could innovate around the sustainability aspect of data center hardware. And I think what we're focused on now within GSF is really taking that to a more foundational level. There's just a huge opportunity right now, with the data center construction boom happening, to really find an even more interesting place where we can take some of those learnings from hardware specifications and apply them to an even broader impact base. Chris Adams: Ah, okay. Alright. I'll come back to some of this, because I know there's a project called Project Mycelium that Asim Hussain, the Executive Director of the Green Software Foundation, is continually talking about. But like we've spoken a little about, you mentioned, if I understand it, that this allows you to maybe have more freedom: instead of having, like, tiny fans which scream at massive, thousands and thousands of RPM, there's other ways that you could maybe cool down chips, for example. And like, this is one thing that I know the hardware standards working group is looking at, is finding ways to keep the servers cool, for example. Like, as I understand it, using liquid can be more efficient, quite a bit more efficient, than having tiny fans spinning at massive RPM to cool things down. But also, I guess there's a whole discussion about, well, there's different ways of cooling things which might reduce the kind of local power draw, local water usage in a data center, for example. And like, maybe this is one thing we could talk a little bit more about then, 'cause we've had people talk about, say, liquid cooling and things like that before, as, like, these are some alternative ways to more sustainably cool down data centers in terms of how much power they need, but also what their local footprint could actually be. But we've never had people who actually have that much deep expertise in this. So maybe I could put the question to one of you. Like, let's say you're gonna switch to liquid cooling, for example, instead of using itty bitty fans, or even just slightly bigger fans running a little bit slower. Like, how does that actually improve it? Maybe I could put this to you, My, 'cause I think this is one thing that you've spent quite a lot of time looking into. Like, yeah, where are the benefits? How do the benefits materialize if you switch from, say, air to a liquid cooling approach like this? My Truong: Yeah, so on the liquid cooling front, there's a number of pieces here. The fans that you were describing earlier, they're moving air, which is effectively a fluid when you're using it in a cooling mode. At 25,000 RPM, you're trying to move more air across the surface, and it doesn't have a great amount of... Zachary Smith: Heat transfer capability. My Truong: ...heat removal and rejection. Yeah, heat transfer capabilities. Right. So in this world, we're not moving heat with air, we're moving it with some sort of liquid: either a single phase liquid, like water, or a two-phase liquid, taking advantage of two phase heat transfer properties. There's a lot of significant gains, and those gains really start magnifying here in this AI space that we're in today.
And I think this is where Project Mycelium started to come to fruition: to really think about that infrastructure end to end. When you're looking at some of these AI workloads, especially AI training workloads, their ability to go and move hundreds of megawatts of power simultaneously and instantaneously becomes a tricky cooling challenge and infrastructure challenge. And so really what we wanted to be able to think through is: how do we go and allow software to signal all the way through into hardware, and get hardware to help go and deal with this problem in a way that makes sense? So I'll give you a concrete example. If you're in the single phase space and you are in a 100 megawatt or 200 megawatt data center site, which is what xAI built out in Memphis, Tennessee, when you're swinging that workload, you are swinging a workload from zero to a hundred percent and back to zero quite quickly. In the timescale of around 40 milliseconds or so, you can move a workload from zero to 200 megawatts back down to zero. When you're connected to a grid... Chris Adams: Right. My Truong: ...that's called a grid distorting power event, right? You can swing an entire grid by 200 megawatts, which is probably, like, maybe a quarter of the LA area: the ability to go and distort a grid pretty quickly. When you're an isolated grid like ERCOT, this becomes a very tricky proposition for the grid to go and manage correctly. On the flip side of that, once you took the power, you created about 200 megawatts of heat as well. And when you start doing that, you have to really think about what you are doing with your cooling infrastructure. If you're a pump based system, like single phase, that means you're probably having to spool up and spool down your pump system quite rapidly to respond to that swing in power demand. But how do you know? How do you prep the system? How do you tell that this is going to happen? And this is where we really need to start thinking about these software hardware interfaces. Wouldn't it be great if your software workload could start signaling to your software or your hardware infrastructure? "Hey, I'm about to go and start up this workload, and I'm about to go and swing this workload quite quickly." You would really want to go signal to your infrastructure and say, "yes, I'm about to go do this to you," and maybe you want to even signal to your grid, "I'm about to go do this for you," as well. You can start thinking about other options for managing your power systems correctly, maybe using a battery system to go and shave off that peak inside of the grid and manage that appropriately. So we can start thinking about this. Once we have this ability to signal from software to hardware to infrastructure, and build that communication path, it becomes an interesting thought exercise, and we can realize that this is just a software problem. For those of us who have been in this hardware, software space, we've seen this before. And is it worth synchronizing this data up? Is it worth signaling this correctly through the infrastructure? This is the big question that we have with Project Mycelium.
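To make the shape of that signalling idea concrete, here is a minimal sketch of what a workload-to-facility heads-up might look like in code. Every name in it is hypothetical: this is not the Project Mycelium API or any GSF specification, just an illustration of announcing a load swing early enough that pumps and batteries can react before the grid sees the edge.

```python
# A minimal, hypothetical sketch of software-to-infrastructure signalling:
# the workload announces an upcoming power swing, and the facility decides
# how to absorb it (pre-spool pumps, shave the ramp from batteries).
from dataclasses import dataclass


@dataclass
class LoadSwingNotice:
    ramp_megawatts: float      # expected swing, e.g. 0 -> 200 MW
    ramp_milliseconds: float   # how fast the swing happens, e.g. ~40 ms
    duration_seconds: float    # how long the high-load phase should last


class FacilityCoordinator:
    """Decides how to absorb an announced swing before it reaches the grid."""

    def __init__(self, battery_capacity_mw: float, pump_spoolup_seconds: float):
        self.battery_capacity_mw = battery_capacity_mw
        self.pump_spoolup_seconds = pump_spoolup_seconds

    def prepare(self, notice: LoadSwingNotice) -> list[str]:
        actions = []
        # Pre-spool the cooling loop so pumps aren't chasing the heat spike.
        actions.append(f"spool pumps {self.pump_spoolup_seconds}s ahead of ramp")
        # Serve the fast edge of the ramp from batteries so the grid sees a
        # flatter profile, then hand the steady load over to grid supply.
        from_battery = min(notice.ramp_megawatts, self.battery_capacity_mw)
        actions.append(f"serve first {from_battery} MW of the ramp from batteries")
        if notice.ramp_megawatts > self.battery_capacity_mw:
            actions.append("notify grid operator of the residual ramp")
        return actions


coordinator = FacilityCoordinator(battery_capacity_mw=50.0, pump_spoolup_seconds=5.0)
for action in coordinator.prepare(LoadSwingNotice(200.0, 40.0, 600.0)):
    print(action)
```

The design point is simply that the workload, not the facility, has the earliest knowledge of the swing; pushing that information down the stack is what lets batteries and pumps flatten the edge the grid would otherwise see.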
Like, it would be amazing for us to be able to do this. Chris Adams: Ah, I see. My Truong: The secondary effect of this is to really think through: now, if you're in Dublin, where you have offshore power, and you now have one hour resolution on data coming through about the amount of green power that's about to come through, it would be amazing for you to signal up and signal down your infrastructure to say, you should really spool up your workload and maybe run it at 150% for a while, right? This would be a great time to really take green power off the grid and drive your workload on green power for this duration. And then as that power spools off, you can roll that power need off for a time window. So being able to think about these things that we can create between the software hardware interface is really where I think we have this opportunity to make game changing, and really economy changing, outcomes. Chris Adams: Okay. Zachary Smith: I have a viewpoint on that, Chris, too. Chris Adams: Yeah, please do. Zachary Smith: My TLDR summary is, infrastructure has gotten much more complicated, and the interplay between workload and that physical infrastructure is no longer "set it in there and just forget it, and the fans will blow and the servers will work and nobody will notice the difference in the IT room." These are incredibly complex workloads. A significant amount of our world is interacting with this type of infrastructure through software services. It's just got more complicated, right? And what we haven't really done is provide more efficient and advanced ways to collaborate between that infrastructure and the workload. It's still working under some paradigms, like, data centers, you put everything in there and the computers just run. And that's just not the case anymore. Right. I think that's what My was illustrating so nicely: that workload is so different and so dynamic and so complex that we need to step up with some ways for the infrastructure and that software workload to communicate. Chris Adams: Ah, I see. Okay. So I'll try and translate some of that for some of the listeners that we've had here. So you said something about, okay, a 200 megawatt power swing, that's not that far away from half a million people appearing on the grid, then disappearing from the grid, every 40 milliseconds or so. And obviously that's gonna piss off people who have to operate the grid. But that by itself is one thing, and that's also a change from what we had before, because typically cloud data centers were known for being good customers, because they have a really flat, predictable power draw. And now, rather than having a flat kind of line, you have something more like a kind of seesaw, a saw tooth, like up, down, up, down, up, down, up, down. And if you just pass that straight through to the grid, that's a really good way to just totally mess with the grid and do all kinds of damage to the rest of the grid. But what it sounds like you're saying is actually, if you have some degree of control within the data center, you might say, "well, all this crazy spikiness, rather than pulling it from the grid, can I pull it from batteries, for example?" And then I might expose that familiar flat pattern to the rest of the grid, for example. And that might be a way to make you more popular with grid operators, but also that might be a way to actually make the system more efficient.
So that's one of the things you said there. So that's one kind of helpful thing there. But also you said that there is a chance to, like, dynamically scale up when there is loads and loads of green energy, so you end up turning into a bit more of a better neighbor on the grid, essentially. And that can have implications, because we are moving to, like you said before, complexity at the power level, and it allows the data centers, rather than making that worse, to actually address some of those things. So it's complementary to the grid, is that what you're saying? My Truong: Yeah. I think you got it, Chris. Yeah. Exactly. So that's on the power side. I think that we have this other opportunity now, as we're starting to introduce liquid cooling to the space as well: we're effectively, efficiently removing heat from the silicon. Especially in Europe, this is becoming a very front and center conversation for data centers operating there: this energy doesn't need to go to waste and be evacuated into the atmosphere. We have this tremendous opportunity to bring that heat into local municipal heat loops and really think about that in a much more cohesive way. And so this is, again, where we really, as Zach was saying, need to think about this a bit comprehensively and really rethink our architectures to some degree, with these types of workloads coming through. And so bringing standards around the hardware, the software interface, and then, as we start thinking through the rest of the ecosystem, how do we bring consistency to some of this interface so that we can communicate "workload is going up, workload is going down, the city needs X amount of gigawatts of power into a municipal heat loop," and help the entire ecosystem out a little bit better? In the winter, probably Berlin or Frankfurt would be excited to have gigawatts of power in a heat loop to drive a carbon free heating footprint inside of the area. But then, on the flip side of that, if you're building a site that expects that in the winter, but in the summer you're not able to take that heat off, how do we think about more innovative ways of driving cooling then as well? How do we go and use that heat in a more effective way to drive a cooling infrastructure? Chris Adams: So, okay. I'm glad you mentioned that example, 'cause I live in Germany and our biggest driver of fossil fuel use is heating things up when it gets cold. So if there's a way to actually use heat which doesn't involve burning more fossil fuels, I'm totally all for that. There is actually one question I might ask, which is: what are the coolants that people use for this kind of stuff?
Because, I mean, when we move away from air, you're not typically just using water in all of these cases; there may be different kinds of chemicals or different kinds of coolants in use, right? I mean, maybe you could talk a little bit about that, because I know that when we've looked at how we use coolant elsewhere, there have been different generations of coolants. And in Europe, I know there's a whole ongoing discussion about saying, "okay, if we're gonna have liquid cooling, can we at least make sure that the coolants we're using are actually not the things which end up being massively emitting in their own right," because one of the big drivers of emissions is, like, end of life refrigerants and things like that. Maybe you could talk a little bit about what your options are if you're gonna do liquid cooling, and what's on the table right now, to do something which is more efficient, but is also a bit more non-toxic and safe if you're gonna have this inside a given space. My Truong: Yeah. So in liquid cooling there's a number of fluids that we can use. The most well understood of the fluids, as used both on the facility and the technical loop side, is standard de-ionized water. Just water across the cold plate. There's variations that are used out there with a propylene glycol mix to manage microbial growth. The organization that I'm part of, we use a two-phase approach, where we're taking a two-phase fluid and taking advantage of phase change to remove that heat from the silicon. And in this space, this is where we have a lot of conversations around fluids and fluid safety, and how we're thinking about that fluid and end of life usage of that fluid. Once you're removing heat with that fluid and putting it into a network, most of these heat networks are water-based heat networks, where you're using some sort of water with a microbial treatment, and going through treatment regimes to manage that water quality through the system. So this is a very conventional approach. Overall, there's goods and bads to every system. Water is very good at removing heat from systems. But as you start getting towards megawatt scale, the size of plumbing that you're requiring to remove that heat and bring that fluid through becomes a real technical challenge. And also at megawatts... Yeah. Yeah. Zachary Smith: If I'm not mistaken, also, there's challenges, if you're not doing a two-phase approach, to actually removing heat at a hot enough temperature that you can use it for something else, right? My Truong: Correct. Correct, Zach. So there's a number of very technical angles to this. So, going down that path, Zach: in single phase, what we do is we have to move fluid across that surface at a good enough clip to make sure that we're removing heat and keeping that silicon from overheating. The downside of this is, as silicon requires colder and colder temperatures to keep it operating well, the opportunity to drive that heat source up high enough to be usable in a municipal heat loop becomes lower and lower. So let's say, for example, your best in class silicon today is asking for what's known as a 65 degree TJ. That's a number that we see on the silicon side. So you're basically saying, "I need my silicon to be 65 degrees Celsius or lower to be able to operate properly."
The flip side of that is you're gonna ask your infrastructure to deliver water between 12 to 17 degrees Celsius to make sure that cooling is supplied. But the flip side of that is, if you allow for, let's say, a 20 degree Celsius rise, your exit temperature on that water is only gonna be 20 degrees higher than the 17 degree inlet, so that water temperature is so low... And that's not a very nice shower, basically. Yeah. You're in a lukewarm shower at best. So then we have to spend a tremendous amount of energy to bring that heat quality up so that we can use it in a heat network. In two phase approaches, what we're taking advantage of is the physics of two-phase heat transfer, where, during phase change, you have exactly one temperature at which that fluid will phase change. To a gas. Yeah. Yeah. To a gas. Exactly. Yeah. And so the easiest way, like, we'll use the water example, but this is not typically what's used in two phase technologies, is that water at atmospheric pressure will always phase change at about a hundred degrees Celsius. It's not 101, it's not 99. It's always a hundred degrees Celsius at atmospheric pressure. So your silicon underneath that will always be at around a hundred degrees Celsius, or maybe a little bit higher, depending on what your heat transfer characteristics look like. And this is the physics that we take advantage of. So when you're doing that, the vapor side becomes a very valuable energy source, and you can actually do some very creative things with it on two phase. So every technology is a double-edged sword, and we're taking advantage of the physics of heat transfer to effectively and efficiently remove heat in two-phase solutions. Chris Adams: Ah, so I have one kind of question about how that changes what a data center feels like to be inside, because I've been inside data centers and they are not quiet places to be. Like, I couldn't believe just how uncomfortably loud they are. And like, if you're moving away from fans, does that change how they sound, for example? Because even if you're outside some buildings, people talk about some of the noise pollution aspects. Does a move to something like this mean that it changes some of that at all? Zachary Smith: Oh yeah. My Truong: Inside of the white space? Absolutely. Like, one of the things that we fear the most inside of a data center is dead silence. You might actually be able to end up in a data center where there's dead silence, soon. And that being a good thing. Yeah. With no fans. Yeah. We'd love to remove the parasitic draw of power from fans moving air across data centers, just to allow that power to go back into the workload itself. Chris Adams: So for context, if you haven't been in a data center... I mean, it was around, I think it felt like 80 to 90 decibels for me, which felt like a lot. Yeah, plus, could have been more, actually. So I mean, if you have something on a wearable, on a phone, as soon as it's above 90 decibels, that's louder than lots of nightclubs, basically. Like, maybe there's a comparison there. So this is one thing that I felt, and it sounds like this can introduce some changes there as well, rather than just, we're just talking about energy and water usage.
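For anyone trying to follow the single-phase arithmetic My walks through just above, here is the back-of-the-envelope version, using the 12 to 17 degree supply and roughly 20 degree rise he quotes. The numbers are only illustrative of why single-phase return water struggles to feed a heat network directly; they are not from any specification.

```python
# Back-of-the-envelope check on the single-phase numbers above: with a
# 12-17 degC facility supply and roughly a 20 degC rise across the cold
# plates, the return water is nowhere near district-heating temperatures.
supply_c = 17.0          # warm end of the 12-17 degC facility supply
delta_t_c = 20.0         # allowed temperature rise across the IT loop
return_c = supply_c + delta_t_c
print(f"return water: {return_c:.0f} degC")   # ~37 degC, a lukewarm shower

# Two-phase working fluids instead pin the silicon near the fluid's boiling
# point at the loop pressure (water at 1 atm boils at 100 degC), so the
# rejected heat comes out at a higher, more reusable temperature.
```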
Right. Zachary Smith: Yeah, most data center technicians wear ear protectors all the time, can't talk on the phone, have to scream at each other, because it's so loud. Certainly there's some really nice quality of life improvements that can happen when you're not blowing that much air around and spinning up multiple thousand... My Truong: 25,000 to 30,000 RPM fans will require you to wear double hearing protection to be able to even function in the space. Yeah, that's the thing. A lot of energy there. Chris Adams: Oh, okay. Cool. So these are some of the shifts that this makes possible. So the idea is, you might have data centers where you're able to be more active in terms of actually working with the grid, because for all the kinds of things we might do as software engineers, there's actually a standard which makes sure that the things we see Google doing, or Meta talking about in academic papers, could be more accessible to more people. That's one of the things that having standards, like Open19, might mean, because there's just so many more people using 19 inch racks and things like that. That seems to be one thing. So maybe I could actually ask you folks: this is one thing that you've been working on, and My, you're obviously running an organization, ZutaCore, here, and Zach, it sounds like you're working on a number of these projects. Are there any particular open source projects or papers or things, with some of the more wacky ideas or more interesting projects, that you would point people to? Because when I talk about data centers and things like this, there's a paper that's called the EcoViser paper, which is all about virtualizing power, so that you could have power from batteries going to certain workloads and power from the grid going to other workloads. And we've always thought about it as going one way, but it sounds like with things like Project Mycelium, you can have things going the other way. Like, for people who are really into this stuff, are there any good repos that you would point people to? Or is there a particular paper that you found exciting that you would direct people to, who are still with us and still able to keep up with the kind of, honestly, quite technical discussion we've had here? Zachary Smith: Well, not to tout My's horn, but reading the Open19 V2 specification, I think, is worthwhile. Some of the challenges we dealt with at a kind of server and rack level, I think, are indicative of where the market is and where it's going. There's also great stuff within the OCP Advanced Cooling working group. And I found it very interesting, especially, to see some of what's coming from hyperscale, where they are able to move faster through a verticalized integration approach. And then I've just been really interested in following along with the power systems and related work from the EV industry. I think that's an exciting area, where we can start to see data centers not as buildings for IT, but data centers as energy components. So when you're looking at, whether it's EV or grid scale kind of renewable management, I think there's some really interesting tie-ins that our industry, frankly, is not very good at yet. Ah. Most people who are working in data centers are not actually power experts from a generation or storage perspective. And so there's some just educational opportunities there.
I've found, just as one resource, and My, I don't know if they have it, the seven by 24 conference group, which is the critical infrastructure conference covering everything from water systems and power systems to data centers, has been a really great learning place for me. But I'm not sure if they have a publication that is useful. We have some work to do in moving our industry into transparent Git repos. My Truong: Chris, my favorite is actually the OpenBMC codebase. It provides a tremendous gateway, where this used to be a very closed ecosystem, and it was very hard for us to think about being able to look through a code repo of a Redfish API. Being able to rev that spec in a way that could be useful and implementable into an ecosystem has been, like, my favorite place outside of hardware specifications. Chris Adams: Ah, okay. So I might try and translate that, 'cause the BMC thing, this is basically the bit of computing which essentially tells software what's going on inside a server, how much power it's using and stuff like that. Is that what you're referring to? And is OpenBMC, like, something that used to be proprietary where there is now a more open standard, so that there's a visibility that wasn't there before? Is that what it is? My Truong: Right, that's exactly right. So in years past, you had a closed ecosystem on the service controller, or the BMC, the baseboard management controller module inside of a server, and being able to look into that code base was always very difficult at best and traumatic at worst. But having OpenBMC reference code out there, being able to look and see an implementation and port that code base into running systems, has been very useful, I think, for the ecosystem, to get more transparency, as Zach was saying, into API driven interfaces. Oh. What I'm seeing is the prevalence of that code base now showing up in a number of different places, and the patterns are being carried, as Zach was saying, into power systems. We're seeing this become more and more prevalent in power shelves, power control, places where we used to not have access, or where we used to use programmable logic controllers to drive this. They're now becoming much more software ecosystem driven and opening up a lot more possibilities for us. Chris Adams: Okay. I'm now understanding the whole idea behind Mycelium, like roots reaching down further into the actual hardware to do things that couldn't be done before. Okay. This now makes a lot more sense. Yeah. Peel it back one more layer. Okay. Stacks within stacks. Brilliant. Okay. This makes sense. Okay folks, well, thank you very much for actually sharing that and diving into those other projects. We'll add some links to some of those things if we can. 'Cause I think the OpenBMC, that's one thing that is actually in production in a few places. I know that Oxide Computer use some of this, but there's other providers who also have that as part of their stack now that you can see. Right. My Truong: We also put it into production when we were part of the Packet Equinix team. So we have a little bit of experience in running this code base in real production workloads. Chris Adams: Oh wow. I might ask you some questions outside this podcast, 'cause this is one thing that we always struggle with: finding who's actually exposing any of these numbers for people who are further up the stack, because it's a real challenge. Alright.
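Since the BMC discussion may be abstract for software folks, here is a rough sketch of what pulling power telemetry out of a Redfish-speaking BMC (the kind of interface OpenBMC exposes) can look like. The host, credentials, and chassis name are placeholders, and the exact resource path varies between Redfish schema versions and BMC implementations, so treat it as an illustration rather than a drop-in script.

```python
# A rough sketch of reading power telemetry from a Redfish-speaking BMC.
# Host, credentials, and chassis name are hypothetical placeholders; check
# what your BMC actually serves under /redfish/v1 before relying on paths.
import requests

BMC = "https://bmc.example.internal"          # hypothetical BMC address
AUTH = ("metrics-reader", "change-me")        # hypothetical read-only account

session = requests.Session()
session.auth = AUTH
session.verify = False  # many BMCs ship self-signed certs; pin them in production

# Older Redfish schemas expose a Power resource per chassis; newer ones use
# a PowerSubsystem resource instead, so this path may differ on your hardware.
resp = session.get(f"{BMC}/redfish/v1/Chassis/chassis/Power", timeout=10)
resp.raise_for_status()
power = resp.json()

for control in power.get("PowerControl", []):
    watts = control.get("PowerConsumedWatts")
    print(f"{control.get('Name', 'power domain')}: {watts} W")
```

This is exactly the kind of number that is hard to get further up the stack today, which is the visibility gap Chris is describing.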
Okay, we're coming up to time, so I just wanna leave one question with you folks, if I may. If people have found this interesting and they want to follow what's going on with Zach Smith and My Truong, where do they look? Where do they go? Like, can you just give us some pointers about where we should be following and what we should be linking to in the show notes? 'Cause I think there's quite a lot of stuff we've covered here, and I think there's space for a lot more learning, actually. Zachary Smith: Well, I can't say I'm using X or related on a constant basis, but I'm on LinkedIn @zsmith, connect with me there. Follow. I post occasionally on working groups and other parts that I'm part of. And I'd encourage, if folks are interested, like, we're very early in this hardware working group within the GSF. There's so much opportunity. We need more help. We need more ideas. We need more places to try. And so if you're interested, I'd suggest joining or coming to some of our working group sessions. It's very early and we're open to all kinds of ideas. As long as you're willing to, to copy a core value from Equinix, speak up and then step up, we'd love the help. There's a lot to do. Chris Adams: Brilliant, Zach. And My, over to you. My Truong: LinkedIn as well. Love to see people here as part of our working groups, and see what we can move forward here in the industry. Chris Adams: Brilliant. Okay. Well, gentlemen, thank you so much for taking me through this tour all the way down the stack, into the depths that we as software developers don't really have that much visibility into. And I hope you have a lovely morning slash day slash afternoon, depending on where you are in the world. Alright, cheers fellas. Thanks Chris. Thanks so much. Hey everyone, thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners. To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation in any browser. Thanks again, and see you in the next episode. Hosted on Acast. See acast.com/privacy for more information.