We’ve moved from a more decentralised internet running on centralised power, to a more centralised internet running on more decentralised power. Is this the only computing model of the future? What would a decentralised internet running on decentralised power look like? We see hints of what this looks like at the edge of the internet, but also the edge of the grid, and this is an area our two guests Dawn Nafus of Intel and Laura Watts of the University of Edinburgh have spent quite a lot of time researching. They join host Chris Adams in this episode of Environment Variables as they explore community clouds, datacentres, energy regulation, projects on the Islands of Orkney and the book that they’re working on together!
Learn more about our guests:
If you enjoyed this episode then please either:
Dawn Nafus: The implication of green software is not just that it's efficient in the immediate savings, but that you're opening the door to this much bigger infrastructure change that is enormously important.
Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software.
I'm your host, Chris Adams.
Hello and welcome to Environment Variables. On this episode, I am joined by Dawn Nafus of Intel and Laura Watts of the University of Edinburgh, and we'll be discussing community clouds and energy islands. We're shifting from an internet powered predominantly by burning fossil fuels 24/7 in large, centralized generation plants to one powered by a wider mix of decentralized forms of energy generation.
And yet over the last 10 years, we've seen a shift from a more distributed internet to one where computing is correspondingly concentrated into large, centralized hyperscale data centers running 24/7, much like the centralized power plants of before. So we've moved from a decentralized internet running on centralized power to a more centralized internet running on more decentralized power.
Is this the only computing model of the future, though? What would a decentralized internet running on decentralized power look like? We see hints of what this looks like at the edge of the internet, but also at the edge of the grid. And this is an area two of our guests have spent quite a lot of time researching, and they're here to share their insights. That's what we're gonna dive into today.
But before we dive in, let's do a quick round of intros in alphabetical order. My name's Chris Adams. I am the host of Environment Variables. I am the chair of the policy working group and the executive director of the Green Web Foundation. I also help manage the community called climateaction.tech, and I'll hand over to the next person, which I think is you, Dawn Nafus.
Dawn Nafus: Yes, I am Dawn Nafus. I am an anthropologist over at Intel, where I focus on AI governance and responsible AI, specifically with an emphasis on AI's role in climate change. I'm also an editor, with Hannah Knox, of Ethnography for a Data-Saturated World, which looks at the surprising ways that ethnography and data science intersect.
So I'll hand it over to Laura.
Laura Watts: Chris, thanks so much for inviting us to be here. This is fantastic. I'm Laura Watts. I'm a consultant and an ethnographer of futures, which basically means that I collaborate with organizations, companies, and communities to explore their innovations and how, together, we might make the future otherwise.
And I have a background in tech, cuz a long time ago I used to work in the telecoms industry, particularly mobile telecoms. And I'm also a professor of Energy and Society at the University of Edinburgh. And as part of all that work, I've written a book called Energy at the End of the World: An Orkney Islands Saga, which is published by MIT Press.
And that's based on the work that I've been doing for, oh, over a decade in the islands of Orkney, which is islands off the far northeast coast of Scotland. And that's actually also where I live. And I've been talking a bit and working with them on their energy futures, which we'll be hearing more about soon, I think.
Chris Adams: Cool. All right. Okay, so for the uninitiated, we've used some words like the edge of the internet and the edge of the grid. Before we go any further, I just figured it might be worth just putting that out to see what that might mean in this context to basically anyone who's up for answering that.
Dawn Nafus: The cloud is somebody else's computer, which raises interesting questions about what we mean by the edge. There's a common distinction we make right now: often when we say the word cloud, what we really mean is the big hyperscale, large infrastructural sort of entities where you can rent space, compute power, all the rest of it.
But what we're seeing now is also some serious growth in what my colleagues in computer science call the edge, meaning computers that live outside of those large spaces. And there's a continuum really, from the large data center, or one that's large-ish by normal people's standards that might be at a hospital or a bank, all the way through to servers that might actually be in mobile phone towers, through smaller and smaller servers, to something that might be in somebody's basement. But all of those things right now we can think about as an edge in computing.
Chris Adams: Okay, cool. That's really helpful. And Laura, you mentioned the edge of the grid.
Laura Watts: Yes, so that's actually where I live. It's a geography, it's a place. When we talk about the edge of the grid, we talk about it, or at least I talk about it, in two ways. The first is that at the edge of the grid, we're in places where the infrastructure, the electricity infrastructure, the grid, is more precarious: when you have storms coming in, the cables between islands, undersea cables, might break because they have a lot of tension on them.
So in places at the edge of the grid, you've got precarious infrastructure. The lights go out occasionally. But unlike in central areas, there's not this media panic or kind of social media meltdown. People just shrug and get on with it. So what that means is you've got places where energy and electricity are visible, and people know where their energy and electricity comes from, because they're either fixing it or maintaining it or thinking about it.
Imagine if you're living in a very stormy environment. What I mean by storm is that it's hard to stand upright, because that's the kind of level of energy there is in the air. You tend to think about energy. It's something that is part and parcel of your everyday life. You feel it on your body. So when I think about places at the edge of the grid, I mean these sites and locations where energy is part of your everyday thinking.
So of course you're gonna be thinking: I've got visible infrastructure, I know where it is, but I'm also gonna think about how to generate it. So in places like Orkney, like I said, off the northeast coast, we've got 22,000 islanders who think a lot about energy. Very stormy. A long way away from London, closer to the Arctic Circle.
So there they've been generating huge amounts of wind energy, cuz I talked about storms, but we also have wave and tide power, and we've also been doing hydrogen storage to think about ways to store these enormous amounts of energy we're generating in the islands. So that's one effect of being at the edge: you think about the energy, you generate the energy.
The second aspect of being at the edge is that the actual renewable energy resources, the environmental resources as they're often talked about, are at these geographic locations. So if you look on a map, where would you go for where it's windy? Where would you go for the tidal resources or the wave resources? You're going to the edge of the map.
So there is a correlation between being at the edge of the infrastructure and the grid, and being where the environmental resources are: where the waves are big, where the tides are strong, where the wind is very powerful, or where there is a large amount of sunlight. So that's the other aspect of the edge of the grid. And those two things are what's really changing the shape of the grid as we go forward.
Cuz you're shifting, as you talked about, Chris, from this centralized fossil fuel power structure, where we've got big power stations outside cities, and suddenly we're going, huh, our grid is now the wrong shape, because all our power's being generated a long way away from our cities, right? It's being generated in these wind turbines offshore.
It's being generated in other locations, and that has huge implications, because the grid is no longer the right shape and we're having to change it as we go into the future.
Chris Adams: Interestingly, Laura, for the green software nerds: all the technology you mentioned there is energy tech that people are putting in data centers right now. And if I understand correctly, Dawn, I think you had something you might want to come in on, cuz I could see you nodding away there just then.
Dawn Nafus: Yes. Yeah, no, I do wanna build on that in the sense that if you think about, there's two things to think about here. One is, if you look at the geography of Hyperscale data centers, you can start to see something of a movement, right? You can see, for example, data centers going in the Pacific Northwest where I live, which is really in part about hydropower.
Some wind to a certain extent. You can start to see things starting to move, but it's also connected to other kinds of infrastructure, other kinds of considerations like latency, right? So we're not yet at the edges in the way that Laura describes in her work on energy edges. But the other thing that comes to mind is that often when we're doing edge computing, energy becomes a more salient consideration, right? You might actually be on battery and have to do a ton of tricks to get your compute down to something you can actually manage. You might be running a camera for whatever reason, and you might wanna actually do the computer vision at the camera and not move a bunch of data that you don't need to be moving around, either because there's an energy consideration or because it just takes a ton of resources to be able to do that.
So we are baby stepping into this new world where energy is distributed in ways that are different from what we're used to. But we're in no way there yet, right? We're not in those kinds of places where the wind really is serious, or where in the middle of the Australian desert where the sun is, no joke, right?
So it's too slow for my liking in other words.
Chris Adams: Okay, and if I understand it correctly, some of the work in the Orkney Islands is actually seeing how some of the communities are using some of this technology and how they relate to it. Is that the case, Laura? This is some of the research you were doing before with the Orkney Cloud, is that about right? Maybe you could come in here at this point.
Laura Watts: Yeah. Picking up from what Dawn said, and this answers your question, we're in this really exciting and important moment where we've gone from cloud computation being about making infrastructure invisible, so if you're a developer, you don't have to think about it, to, as we move to a just energy transition, having to think about where our power comes from. Then suddenly we've gotta understand that question: where does our power come from? Where does our energy come from? You know, the cloud can no longer be something that's untethered from the energy infrastructure beneath it. It's part of the protocol stack, in some ways. So one thing we've been doing in Orkney, which has been quite fun, is asking this question, and it goes back to what you were saying, Chris, but the data center industry is already thinking about where it gets its power from.
That's absolutely central; PUE is everything. So what's happening is this reflection on moving away from just power purchase agreements, which are, as I'm sure your listeners know, how data centers often try to power themselves from renewable sources: they cut a deal to buy renewable energy, or invest in renewable energy.
But what's been going on in Orkney, rather than handing the problem off to the market, is thinking about different kinds of business models, and that's where it starts getting really exciting. Because instead of it being about, okay, we're just gonna cut a deal and then some, again, distant unknown geography generates the electricity for us, through an offshore or onshore wind farm, or through wave or tide power, that's something I'm quite interested in, marine energy, maybe we could have a local community and cut a deal direct with them, and that money goes to a charity in the islands that supports the island community. So you can start thinking about fun ideas like fair trade energy, or buying direct from wind farmers, literally.
So all these ideas, which we know that we have, suddenly become relevant to the energy and data industry. Some of the things we've been doing in Orkney run through various projects. We've run an Orkney Cloud project, which was a collaboration with Mozilla, which was great. We've also had a project called Reflex Orkney, which is a government-funded project to demonstrate a flexible energy system, and you'll have to ask me to explain terms.
If I get too geeky, it happens. But what we've been doing is saying: normally you manage a grid at grid scale, so these power purchase agreements might go to renewable energy generation and grid management at grid scale. You're looking at grid-scale batteries, or you're looking at something like switching turbines on and off, what you might do through a smart grid.
But in Orkney, we're going: 22,000 people, we're looking much smaller. So we're thinking about managing things like home batteries, or electric vehicle chargers, or micro wind turbines, much smaller community-scale things, hydrogen electrolysis, which we have connected to some of our community wind turbines, managing all these things to balance the grid. So it's a different kind of management, and it enables the communities in Orkney to think about how they can take control of the grid and also generate energy renewably. Orkney generates 120% of its electricity needs from renewables, so it's got this amazing resource. And the first stage is doing that, doing energy-as-a-service type ideas, or thinking about community asset management.
But then the question is: what if some of those assets weren't just energy assets like a home battery? Maybe it's a server, or maybe it's a data center container sitting on a beach, plugged into the community wind turbine, and you're managing that as an energy asset. Suddenly things become really interesting. We haven't done it yet, but it's an exciting idea.
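The community-scale balancing Laura describes, switching small flexible loads on when local wind output exceeds demand, can be sketched roughly like this. The asset names, numbers, and greedy policy are purely illustrative, not how Reflex Orkney actually works:

```python
def dispatch_surplus(wind_kw, base_demand_kw, flexible_loads):
    """Greedily switch on flexible community loads to soak up surplus wind.

    flexible_loads: list of (name, kw) tuples, e.g. home batteries,
    EV chargers, a hydrogen electrolyser. All values are illustrative.
    Returns the loads switched on and any wind left to export or curtail.
    """
    surplus = wind_kw - base_demand_kw
    switched_on = []
    # Try the biggest loads first, so large surpluses get absorbed quickly.
    for name, kw in sorted(flexible_loads, key=lambda load: -load[1]):
        if kw <= surplus:          # this load fits in the remaining surplus
            switched_on.append(name)
            surplus -= kw
    return switched_on, surplus
```

For example, with 100 kW of wind, 40 kW of baseline demand, and an electrolyser, an EV charger, and a home battery available, the dispatcher would switch on the electrolyser and the charger and leave a small remainder for export.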
Chris Adams: Okay, so as I understand it, there was some work with, say, Microsoft having submerged data centers in various parts of Scotland. This is adjacent to some of that, perhaps. Is that the case?
Laura Watts: Yeah, what you're talking about is something, again, listeners might have heard about. They might have heard about Orkney because Microsoft, who ran the project, Natick, said that Orkney had just become one of the most exciting places in tech. It was an underwater data center filled with nitrogen, and it was plugged into the European Marine Energy Centre test sites.
So it was planted on-grid, running off the green electricity in Orkney, and it was underwater for, I'm not sure how many months, maybe a year. You know, listeners can look up the project online. And that's about demonstrating the feasibility of underwater data centers for reliability purposes and, obviously, cooling.
Cuz data centers get hot; that's why we're talking about them, cuz it's all about energy and it's all about cooling. In Orkney, obviously, that partly inspired us, but we were also thinking: you could simply have a server, or, again, you know data centers come in containers. What if we plugged them into our grids and had a conversation about how we think about data processing and data storage in a much tighter relationship with renewable energy generation? Because, as we know, renewable energy only gets generated when the wind blows. We have to either store it, in hydrogen or other forms of storage, or we have to change the way and when we process, cuz renewable energy needs to be shifted in time and space.
It needs to be moved on the grid over space and it needs to be moved in time to when you need it. And that's maybe something which you can start linking up with the data processing and storage.
Dawn Nafus: One of the reasons why I wanted to think about this topic with Laura is that I had separately started doing some work in carbon aware computing, which I understand you had a whole podcast on, on this very topic, and I encourage folks to listen cuz it really was wonderful. But if you haven't, the short version is that it's about finding techniques to run your workload when and where the renewable energy is available, right? So when the wind is blowing, where the sun is shining, all the rest of it. We've been exploring inside our labs how to do this with AI training, which is a good thing to do in a carbon aware way because you can wait, right?
The data scientist might be able to wait an hour, might be able to wait a couple days, cuz these things sometimes take weeks to train up. One of the things we quickly learned is that carbon awareness is, yes, a scheduling problem, right? So there's some software that folks can build, and have built, about when and how to place your workload and all the rest of it.
But it quickly becomes not just a scheduling problem. All of a sudden you start to see, oh geez, the grid in California in fact does look really different from the grid in Oregon, which looks really different from the grid in New Mexico. So all of a sudden you start to have a relationship to place that you wouldn't otherwise.
Right? And you start to think in terms that I don't think folks tend to think in. Reading Laura's work, you can then ask the next question, right? Which is: maybe it's not just about the locality and the grid. There actually might be opportunities to start to think more deeply about who's benefiting, who's actually running what, and where workloads are actually going.
Right? And we can start to make choices about that. So as another example, one of the things that's really been heating up right now on social media, you might imagine, is that with the recent changes, shall we say, to Twitter, there are a lot of folks like myself who have moved over to Mastodon. And on Mastodon, we've been having a rip roaring conversation about what it would take to actually stand up a Mastodon server in a place like Orkney, where stuff is in fact community run and where there actually is community benefit in how the energy works and how it's organized. And there are a million challenges to that that we can talk about, but that's that next step. Once you get beyond scheduling, you can start to think about all these other social implications that are far deeper than just, you know, writing some scheduling code.
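The scheduling side of carbon aware computing that Dawn describes can be sketched in a few lines: hold a deferrable job, such as an AI training run, until grid carbon intensity drops below a threshold or a deadline passes. `get_carbon_intensity` here is a placeholder for a real data feed, such as a grid operator's API, not an actual library call:

```python
import time

def run_when_green(job, get_carbon_intensity,
                   threshold=200, deadline_s=3600, poll_s=300):
    """Defer a job until grid carbon intensity (gCO2/kWh) is low enough.

    job: zero-argument callable, e.g. one that kicks off a training run.
    get_carbon_intensity: callable returning the current gCO2/kWh;
    a stand-in for a real carbon-intensity feed.
    The job runs once intensity drops below threshold, or once the
    deadline passes, whichever comes first.
    """
    waited = 0
    while get_carbon_intensity() > threshold and waited < deadline_s:
        time.sleep(poll_s)      # the training job can afford to wait
        waited += poll_s
    return job()                # grid is green, or we hit the deadline
```

In practice a scheduler would also choose *where* to run (which grid region), not just *when*, but the deferral loop above is the core of the "you can wait" idea.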
Chris Adams: So if I understand that correctly, Dawn, you're talking about like once you've solved some of the kinda scheduling problems, I suppose there's a chance to then layer over kind of higher level services, like some of the things you might associate with having to have in a more traditional kind of data center.
Like, for example, we were using Twitter, and that's considered one of the ways that people communicate and coordinate with each other. And you're saying, once you've got that, it's plausible to think there's other ways you could create, for want of a better word that I'm borrowing from some work by Rachel Coldicutt, community technology: provision of other kinds of services that you might otherwise be getting from very large companies, like, say, Google Drive and stuff like that.
Is that what you're alluding to there?
Dawn Nafus: Yeah, absolutely. It doesn't have to be that way all the time, right? Certainly when my colleagues are training their AI, they're gonna very much want it on their own servers, for very good, sound reasons, right? But the world is not all that. And so we can start to think about, okay, where does stuff go? To whom does it go? And what are you actually doing on those servers?
And that's when you can start to think about localities, communities, and ultimately who benefits.
Laura Watts: I thought it might be really helpful for listeners to understand how this all connects together and how the community might benefit, from A to B. So if we're thinking about doing some kind of data processing on a server or in a data center, basically that requires energy; it requires electricity, right?
It requires energy in terms of cooling, but it also requires electricity in terms of just powering the kit. So that's the first bit. You're basically then putting a load on the electricity grid; that's what your data processing does, it generates a load on the grid. That means a local community can sell electricity, either direct to the data center or server, or to that distributed part of the grid.
The other thing to realize is that grids are getting more decentralized in the way they're managed. But by creating a load on the data center, by doing your processing in a very particular data center, you potentially allow a local community with a community-owned or locally owned energy generator, like I said, wind is obvious, but it might be solar, it might be other things, or something more complex like a flexibly managed system, to sell electricity and gain revenue.
Because when a community or an individual sells electricity at a small scale, they generate money from what's called the feed-in tariff. So you get money by selling electricity, and therefore you can generate profit and revenue from the sale of electricity, and that money can be used to reinvest in local communities, to support public services, or to support local initiatives.
There's lots of things it can do, so I just wanted to paint it out end to end so people can understand why there's this direct relationship with, not community benefit in some random cash sense, but real, organized, governed benefit. Cuz often these organizations at a community level have very clear governance about what decisions they're gonna make and what they're gonna invest in.
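The end-to-end picture Laura paints reduces to some simple arithmetic: exported kilowatt-hours times a tariff rate, with a governed share earmarked for local reinvestment. The tariff rate and the split below are invented for illustration, not real tariff figures:

```python
def community_revenue(kwh_exported, tariff_gbp_per_kwh=0.15, reinvest_share=0.8):
    """Revenue from community-owned generation sold at a feed-in tariff.

    All rates here are illustrative placeholders, not real tariff values.
    reinvest_share models a community governance rule about how much
    revenue goes back into local services and initiatives.
    """
    revenue = kwh_exported * tariff_gbp_per_kwh
    return {
        "revenue": revenue,
        "local_reinvestment": revenue * reinvest_share,
        "operating_costs": revenue * (1 - reinvest_share),
    }
```

So a community turbine exporting 10,000 kWh in a period would, under these made-up numbers, raise £1,500, of which £1,200 flows back into the islands.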
So these are the things that make it a very powerful potential kind of business model. And Dawn talked about the Mastodon example because that's a really nice level to think about, because most of the time, certainly in energy, it's about households; in tech, it's often about users. Things are thought about as individuals.
Or it's the hyperscale, huge data centers, big stuff. But when we start thinking at community, you know, at the fediverse level, we're thinking about groups of people. And that's a really interesting place to think about.
Chris Adams: I see. So on what you've described, Laura, there's actually another one of the Green Software Foundation members called the SDIA, the Sustainable Digital Infrastructure Alliance. They've got this notion, which they refer to as digital power: if you imagine there being a feed-in tariff for electricity, they basically conceive of some of the primitives of computing, say storage or compute or networking, as another kind of building block, digital power, that you put together to build other kinds of applications on top of. And what it sounds like you described there is that if organizations are able to, instead of just generating their own power, it's plausible that they might do something like generate their own kind of digital power, if you like, and provide something that's maybe slightly higher value, which they could then use as a basis for building services, or things to meet some of their own community technology needs.
Laura Watts: It's a really interesting idea, so I'm gonna look into that, Chris, that's awesome. One of the things that's really interesting, something that Dawn and I have talked about, is also the challenge of expertise. Cuz the place where I work, in Orkney, has this huge amount of electricity and energy grid expertise.
What it doesn't have so much of is basically expertise in data processing infrastructures, you know, InfoSec. It just doesn't have as much expertise in that because Orkney has a history of North Sea oil and gas, so it's got its long energy history. So a lot of the things you are talking about I think are incredibly exciting, but the piece that's almost underneath that is how do we get the expertise or bring the expertise or bring training and understanding of those things to the right location?
So that's one of the reasons that Dawn and I have been having this conversation. It's: okay, so we have all this expertise in the energy industry, with its cultures of keeping the lights on. And then there's other sets of expertise in tech, which is thinking through issues like carbon aware computing, but is often playing a very different game, with different kinds of expertise.
So how do we bring these together?
Chris Adams: You shared this enticing term, a digital blacksmith, with us before. Is this one of the references for that, or something else? Cuz I heard that before, and there's this idea of addressing some of these skill gaps: people who might know their way around energy but not computing, or vice versa. We still have this kind of gap to close to create some of this.
Dawn Nafus: Yeah, and I think not everybody needs to be a software person necessarily, to bring it back to the Mastodon example. And we might be able, in a one-off sort of way, to actually find the right people, right? Our networks are pretty big. Your listeners might in fact want to jump in and say, hey, I'll stand something up over on Orkney.
But then it raises this question of, okay, what about these other corners, other places where people might have onsite renewable energy of a kind and say, yep, I'm very happy to have some sort of server equipment in here. But what next? And so there's a need for some sort of abstraction layer or something, where in a sense you're abstracting enough away so that folks need to know just enough to get done what they want done.
You don't wanna abstract away the location in the way that hyperscale currently does, right? You want some of that visible. But you know enough that we're actually getting into somebody's server in a safe and secure way, and doing that kind of orchestration with the energy, which is important, and that's something that should be easy and available off the shelf.
Chris Adams: So there's two projects which spring to mind when you talk about that, and I'm getting lots of nods here on the podcast. One of them is the Solar Protocol, which is a collaborative project of various Raspberry Pis with batteries and solar panels distributed all around the globe, with the idea being that the node that has the most sun and the highest amount of charge in its battery will be the one that serves whatever website is actually hosted with the Solar Protocol. And for a while, that's literally just been one website, but they had a hack day a few months back where they actually started talking about the underlying technology. And basically the underlying technology is a Raspberry Pi with an Apache server, really reassuringly boring technology that you can run WordPress on.
And they now have an open project to start hosting new projects on this distributed, super green CDN, I suppose you might call it, a solar CDN. That's one project. There's another thing that I've come across in the UK which may be of interest. There's a company called Green Cloud, and they've taken advantage of the fact that most computers have quite a lot of excess computing power now.
So the idea would be that if you have a machine with spare computing capacity, and you have, say, rooftop solar or stuff like that, you can add your computer as a node to, basically, a set of serverless-style tools. So you have something which feels a lot like the serverless tools that you do have, but you know for sure that you're running it entirely on green power, basically a kind of mini rooftop solar powered data center inside someone's home.
So this idea of distributed computing, there's a few interesting examples in the UK, but I would love to hear about more, because the Solar Protocol is actually more American and global. But these are the two that I immediately start thinking about when you mention that, actually, Dawn.
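The core selection behaviour Chris describes in the Solar Protocol, directing traffic to whichever node currently has the most stored solar charge, can be sketched as a tiny routing function. The node data is invented, and this is a sketch of the idea, not the project's real code:

```python
def pick_sunniest_server(nodes):
    """Route to the node with the most stored solar energy.

    nodes: mapping of hostname -> battery charge percentage, as each
    Raspberry Pi in the network might report it. Purely illustrative
    of the Solar Protocol's selection idea.
    """
    # max over the dict keys, ranked by their reported charge
    return max(nodes, key=nodes.get)
```

A real deployment would also need health checks and a tiebreak, but the principle is just "serve from wherever the sun is".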
Dawn Nafus: Yeah, I think they speak to two things. One is, you know, when the necessary thing just becomes so evident, you tend to see it cropping up here and there and everywhere. But you mentioned boring servers, and I think it also speaks to the need and the importance of doing the boring parts of green software development: making sure that your code is what it needs to be, right? That it's not collecting excess data that you actually don't really need, or that you've architected it in a way that makes sense for the infrastructure that you've got, because that infrastructure is changing, right? And so the software that's gonna run really well on the Solar Protocol, or the other infrastructures that you mentioned, those things have to be developed for and kept in mind as you're building stuff.
So, you know, the implication of green software is not just that it's efficient in the immediate savings, but that you're opening the door to this much bigger infrastructure change that is enormously important.
Chris Adams: So it's not just efficiency. There's a piece about resiliency related to that as well.
Dawn Nafus: Absolutely. Absolutely. You're making it possible to be resilient, right?
Chris Adams: Okay. This brings us to a nice question about what this actually means for an actual developer, and how this might be different to developing for, say, the cloud like you might have had before. I know that there is one paper that was released recently. Uh, oh God, I totally forgot the name. I think it's Noman Bashir.
I'll need to check. It's this notion of an ecovisor. Basically, we have hypervisors, which basically take a resource, like a large computer with a certain amount of hard drives and stuff like that, and split that up into a set of virtualized resources. The concept behind an ecovisor is to do the same thing, but with power.
So rather than just having a steady supply of power coming in that you don't really have any real understanding of, you instead have power split into three kinds of forms. You'd have grid power, where the carbon intensity might change over time. There's renewable onsite power, which is very low carbon, but somewhat variable.
And then there's this notion of battery power, which is coming from something local that you might be able to design with. The idea is that if you know what capacity and what quantity you have of each of these going forward, then you can design with this in mind to make the best use of it, reducing the carbon intensity of the software you're running, by either using a certain amount of battery over a certain amount of time, or, in cases where you have an abundance of power, you might say,
just spawn a set of extra subprocesses to use those resources. That's the only example I've seen so far, but I would love to know if there are any other ones that you folks have come across, because that was a really cool idea, and I only found out about it from one of the previous guests, Philipp Wiesner, actually, when he shared a link saying, Chris, check out this cool paper. It's awesome.
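The scheme Chris describes can be sketched roughly in code. This is a hypothetical illustration of the idea, not the API from the ecovisor paper: a small scheduler that looks at the three virtualized power sources and decides how many extra worker subprocesses it is worth running right now. The wattage and threshold numbers are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class PowerSupply:
    """Snapshot of the three virtualized power sources an ecovisor might expose."""
    grid_carbon_intensity: float  # gCO2/kWh; varies over time
    solar_watts: float            # onsite renewable output; variable
    battery_watt_hours: float     # locally stored energy we can draw down

def workers_to_run(supply: PowerSupply, watts_per_worker: float = 50.0,
                   carbon_threshold: float = 200.0) -> int:
    """Decide how many extra subprocesses to spawn given the current power mix."""
    # Use the zero-carbon onsite solar capacity first.
    workers = int(supply.solar_watts // watts_per_worker)
    # If the grid is clean enough right now, allow one grid-powered worker too.
    if supply.grid_carbon_intensity < carbon_threshold:
        workers += 1
    # No solar, dirty grid: fall back to a single battery-powered worker
    # if there is enough charge left, otherwise run nothing extra.
    if workers == 0 and supply.battery_watt_hours > 100.0:
        workers = 1
    return workers
```

A real system would of course read these values from hardware or an API rather than a dataclass, but the point is that the power mix becomes an input the software can schedule against.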
Laura Watts: It sounds really cool. I think for me what it highlights is this kind of coming together of two very different sets of expertise and development of different systems. So you've got the ecovisor concept, and I'd be really interested, I haven't read the paper, to see how much that's in conversation with people who are doing grid transmission: active network management systems and flexible management systems. So people working inside DSOs and DNOs, you know, on grid, who are struggling at the moment to figure out these really important questions of how you manage your grid assets, because you've gotta switch stuff on and off, you've gotta be able to balance it, and you've gotta know as much as you possibly can about your wind turbine and how it's operating, and about your various other things, I talk about them as assets, but essentially load, or batteries, and you gave a really nice categorization there.
So the question for me is, how do we bring these two together? Because there is enormous expertise that goes back decades that's trying to address some of these issues. And then if you look at, say, the UK government's white paper on energy grid transmission and the energy future,
it talks a lot about data. We need to think about open data; there's the open data task force that's trying to think about this, because all the things that are raised in that paper only work if you can get the data from the asset. You need to know about the battery somebody else has; they need to manage it.
And some of that exists. I mean, we've been trying to do this in Orkney with the ReFLEX Orkney project, because we're trying to take the data from the home batteries and the micro wind turbines. All these things have apps; they will have APIs. We can get that data, and then we need to be able to manage those things on the grid.
So the first piece is: can you get the data you need? And that's not straightforward or trivial. And the second thing is regulation. Dawn, you just talked about really dull things, the dull side of green computation; I wanna talk about regulation. So before your listeners feel like I'm about to give a snoozefest: regulation is one of the biggest challenges to what we're talking about for energy and data, and to thinking about how we do things like the ecovisor, that is, manage assets. You need to have permission from the regulator to switch these things on and off, or to have any impact on the grid, because keeping the lights on is an absolute commitment. So if you're going to start changing the load, if data's gonna get into this space and we're thinking about how we write code that uses different amounts of energy sources, that's gonna change the load on the grid. And that starts getting into regulatory issues.
It seems like a dull thing, but actually it's a really important space to start talking about, because we can have huge impacts on what the grid looks like. What does a combined data and electricity grid look like in the future? That's a regulatory and governance question as much as it is a technical one: how do we shove the data about and change what the software looks like?
Dawn Nafus: Yeah, and just to highlight here, in a sense data figures at two levels. One is the data that you need to pull this off at all: the instrumented assets that allow you to put stuff where it needs to go, what granularity that has, how that's governed, all the rest of it. But then there's also the data that you're shipping around. And there's a straightforward example of how somebody developing software might develop differently if we did have something like this ecovisor thing, or whatever it is, just in the face of changing energy situations.
One of the things we're starting to learn with our carbon aware AI training project is that even just how you develop your training experiment makes a certain amount of assumptions. If you assume that energy is just on and you just run the thing, you build it one way: you don't really care too much about pausing,
for example, which you might if your energy is intermittent. As another example, you might not care too much about where the data is in relation to the compute, unless you're really pressed for time, in which case you really would. But if you're architecting the training code such that you actually know where the data is in relation to where the best place for compute is, well, you've gotta take this into account in your model, right?
Like how it's gonna train. So once all those options open up, there are implications for how you're developing machine learning models. And those are just the simple ones. We're learning what those are as we go along, but I suspect it'll get more interesting as time goes on.
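The pausing idea Dawn mentions can be sketched as a training loop that checkpoints and waits whenever grid carbon intensity is too high. This is a minimal, hypothetical sketch, not Intel's actual project code: the carbon reading, training step, and checkpointing are all caller-supplied callables.

```python
import time

def train_carbon_aware(steps, get_carbon_intensity, run_step, save_checkpoint,
                       pause_above=300.0, poll_seconds=60):
    """Run `steps` training steps, pausing while grid carbon is too high.

    `get_carbon_intensity` returns the current grid intensity in gCO2/kWh;
    `run_step` performs one training step; `save_checkpoint` persists state
    so a pause (or interruption) loses no work. All are assumptions of this
    sketch, not a real framework API.
    """
    completed = 0
    while completed < steps:
        if get_carbon_intensity() > pause_above:
            # Grid is dirty: checkpoint and wait for a cleaner period.
            save_checkpoint(completed)
            time.sleep(poll_seconds)
            continue
        run_step(completed)
        completed += 1
    save_checkpoint(completed)
    return completed
```

The point is architectural: once pausing is a first-class possibility, checkpointing frequency, data placement, and step granularity all become design decisions rather than afterthoughts.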
Chris Adams: That's really interesting, because that's the first time I've heard people talking about fighting data gravity: the idea that everything ends up just being in one gigantic big box of an out-of-town data center. If you design it differently, there's a way you can basically design away from that gravity issue, essentially, right?
Dawn Nafus: Right. Exactly. Yeah. And there might be dynamic choices, right? There might be a time and a place where you look, the data gravity is just so big. It's not worth me messing around with the grids greener over there. Right. It's just not right. You know? You know you're gonna have to do. Training 10 times over , right?
You might want to move stuff wholesale. You might wanna know actually what the networking costs are in between in case you have to shuttle, right? There might be a moment where the grids are running renewable energy in such a way that actually it's worth your while to shuttle. Again, these are things that we're just exploring, but yeah, it, it might not be the same type of training, you know, everywhere for everyone.
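That move-or-stay choice can be framed as a back-of-envelope comparison: compute carbon in each region versus the carbon cost of shipping the data over the network. This is a hypothetical sketch with made-up constants, not a published methodology; the network energy per gigabyte in particular is a crude illustrative placeholder.

```python
from typing import Optional

def worth_moving(data_gb: float,
                 local_intensity: float, remote_intensity: float,
                 compute_kwh: float,
                 network_kwh_per_gb: float = 0.01,
                 network_intensity: Optional[float] = None) -> bool:
    """Rough estimate: does shipping the data to a greener grid beat staying put?

    Carbon intensities are in gCO2/kWh. `network_kwh_per_gb` is an
    illustrative assumption, not a measured figure.
    """
    if network_intensity is None:
        network_intensity = local_intensity
    # Emissions if we train where the data already is.
    stay_put = compute_kwh * local_intensity
    # Emissions if we move: remote compute plus the transfer itself.
    move = (compute_kwh * remote_intensity
            + data_gb * network_kwh_per_gb * network_intensity)
    return move < stay_put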
Chris Adams: This is really interesting, because this keeps coming back to some work by Lorenzo Kristov about basically taking the lessons learned from computer science, like the Law of Demeter and not having to see every single thing, and applying lots of the ideas from computer science to grid design.
This sounds really interesting. I wanna just check: Laura and Dawn, you folks have been working on a book. You've been thinking about this for a while, and I understand that you've been putting together a book or something around this. Is there a chance you might be able to talk a little bit about that?
Then maybe we'll wrap up with the last few things, because there are a few questions about what you're listening to and looking at right now. And I'm curious about this book, now that it sounds like there's some work going into it.
Laura Watts: Yeah, definitely. I think the first thing to say is that all the things we're talking about are the things that we've been thinking through, because there is this extraordinary, I don't want to say collision, but this entanglement between energy and data that is really important.
It's particularly important as we go through this change to renewable energy. It's having an impact on data, and we can think about what the future looks like for this combined energy and data industry. Dawn, you were just talking about AI. One of the things that you and I have also talked about is an utter respect for computing as a whole.
So, I have a physics degree from way back when, and that leaves me with a permanent understanding of the fact that when you talk about data, it's basically just energy: an energy difference between a one and a zero. That's all it is. So whenever you're storing any data, whenever you're doing any processing, you are always using energy, and that has huge implications for whether you decide to make an AI at all, whether you decide to store data at all. We have terms like "data lake" and all these kinds of things
that seem to give the impression that data is some kind of inert object that's just sitting there, and of course not. All the data that you store costs energy, and that has an implication for the environment. We can sit here and say that's bad, but the conversation we're having today is exciting because we can say, actually, we've got lots of really smart ideas about what we might do about that.
We don't have to just shrug. That's really where the book came from: this sense that we've got this interesting combination of industries that are coming together because there's a shared issue, the fact that we know we need to use energy much more smartly, and that requires data and energy together.
So we've got this kind of closed-loop issue. And that's really where we started having the conversation, and it led us through things like, as you heard at the beginning, we're ethnographers, which means that we think about issues like culture, because that informs the way you think you can imagine a future and build a future, based on your experiences of where you come from.
Originally you might think there's a big difference between tech and energy, because energy is about keeping the lights on. It's very conservative, it's very risk averse. And you know the traditional idea of the tech industry: it's move fast and break things, it's alpha, beta, ship. That really doesn't work in the energy industry.
And then, Dawn, you and I started talking about how they're actually much closer than that. Dawn, you were talking about data centers in particular being much closer to the energy industry.
Dawn Nafus: Yeah. As computing has become more and more an important infrastructure in our everyday life, in the same way that energy is right, things have started to go in that direction, right? We really care if you're running a data center, you really care about keeping uptime, keeping your service level agreements, all the rest of it.
If you're in networking, which is computing, right? Same principle holds, right? You don't want packets dropped and things going down. So things are running a lot more like an infrastructure and not just bits of code here and there. And that's important. So there is a sudden point of convergence and that's also interesting.
So again, what we're trying to do in this book is to really think about, okay, what are the best of both worlds that are really gonna help us get through this energy transition?
Laura Watts: Just as an example: the data industry knows about data privacy. We've had years' worth of thinking about this. We just talked about AI; the data industry knows about this stuff, and it's really new to the energy industry. The energy industry's got smart meters, it's got home batteries, it's working with personal data, but there's not a lot of experience of thinking about these issues.
We can start realizing there's a lot of benefit to bringing them together and sharing the expertise across them. And in the energy industry, certainly in Europe, there's a commitment to making sure you keep the lights on for vulnerable customers, knowing who the most vulnerable people are and making sure they aren't without electricity.
So that ethical attention is something we can bring to the data industry, for example. And that's just the tip of the iceberg of some of the things you might be able to do. All the things we've talked about in this podcast, these different kinds of business models, different ways of thinking about meso levels of scale, ways of thinking through energy and data together, and the flexible management of assets and data,
these are all the kinds of things we're thinking through, and the kinds of scenarios for what a future might look like, where renewables can run the grid and can run data, in an energy mix. I've talked about lots of different energy sources already, but there's a huge number of different ways you can generate energy around the world, and that, you know, is enormous potential that data can really thrive on.
Chris Adams: Cool. Thank you. Before we go, let's get the name of that book, so we know what to be looking for. Is there a working title, or something to look out for?
Laura Watts: We do not have a working title at the This is the most exciting thing because of the fact that the, your listeners are getting literally hot off the press ideas, but what you can do is actually you can get the precursors. So I have a book, which I mentioned, energy at the End of the World, an Orkney Island Saga, which is published by MIT Press, and that is available, it's published in 2019 won various awards.
It's written for a broad audience. It's intended to explain what's been happening so innovatively in Orkney, how and why, and gets into some of these issues. And Dawn, you also have a book out as well.
Dawn Nafus: I do, I have a couple, so there's one. On the relationship between ethnography and data science called ethnography for a Data Saturated World with Hannah Knox. There's also some earlier work on the quantified self movement and self tracking and all things to do with keeping track of both your body and the environment that it lives in.
This will be a new adventure for me, certainly, but I'm looking forward.
Chris Adams: Cool. Thank you, Dawn. So I'm just gonna wrap up now with one question: is there a book or a podcast that you're listening to or reading right now that you'd like to share with people?
Dawn Nafus: I'm certainly hugely inspired by Kim Stanley Robinson's Ministry for the Future, which I think really does a beautiful job of articulating what it might look like to take climate seriously and all of the social repercussions of doing so.
Laura Watts: And for me, one of the podcasts I've been listening to recently is actually about the practicalities of writing. It's called Scriptnotes, a podcast for screenwriters and about things that are interesting to screenwriters, so it might seem quite distant. It's by John August, who's a film screenwriter, and Craig Mazin, who was the showrunner and screenwriter for Chernobyl.
The reason I find it so helpful and inspiring, as an author and writer myself, is that many of us are actually writers. All academics are professional writers; many people are writing reports, and words have enormous power. Obviously words take power to transmit,
literally they cost energy to transmit, but also the style of writing we choose changes how effective what we say can be. And as an author I think a lot about choosing the right words, about making sure the words are as effective as they can be. Because whenever we're choosing to write, even if it's a technical manual, it sounds an odd thing to say, but I think our words can really change the world.
Even when we're writing something quite simple, we transmit a lot of knowledge through the things that we say, whether it's here through podcasting or through the actual written word itself. So I find Scriptnotes really helpful for reminding people to pay attention to the editorial process and to think about the power of words.
Chris Adams: Wise words to end the podcast with, then, Laura, thank you. Okay, we've come up to the hour, and folks, I've really enjoyed this. So thank you very much for beaming in from the various parts of the world, from Orkney to California, and I'll probably see you again on a future Green Software Foundation Environment Variables podcast.
Thanks folks. Take care of yourselves. Bye.
Dawn Nafus: Thanks for having us on. Bye-bye.
Laura Watts: Appreciate it. Bye.
Chris Adams: Hey everyone. Thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, Google Podcasts, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing; it helps other people discover the show, and of course, we'd love to have more listeners.
To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation, in any browser. Thanks again, and see you in the next episode.