08 Sep 2023

Well, we’re in a situation right now where many datacenters are being abandoned because they were mandated (through subsidy) to be powered by green energy, and a solar/wind-only grid can’t supply sufficient density.
Datacenters built on solar/wind promises 10 years ago are, due to growing compute density, less than 25% full, and the solar/wind provider is saying: “sorry, this is all we’ve got in this region; you told us a decade ago this was what you wanted, so we’re sticking to it.”
And in which SF story did that happen?
Please include the planet too … sounds interesting.
Plenty of information about this in the Netherlands, New York, Northern Virginia, and California.

Yeah let’s build a green data centre powered by nuclear. I’ll provide you your needed cloud storage in 30 years.

Yeah let’s build a green data centre powered by nuclear. I’ll provide you your needed cloud storage in 30 years.
thegarbz, demonstrating his remarkable ability to struggle with planning over the mid to long term…

thegarbz, demonstrating his remarkable ability to struggle with planning over the mid to long term…
Are you implying that people build datacentres as a long-term construction project? That is hilarious. What other failing business ideas do you have? Let me guess, you just finished funding and building a typewriter factory.
Datacentre projects are conceived and constructed in 5-year timeframes, not the mid-to-long term.
What is hilarious is you giving yet another example of your failure to think long-term. Or even slightly mid-term.
Current data centers, or the ones being built today, will just use the current mix of the electricity grid they are connected to. Changing the ratio of your mix (as in, having more low-carbon sources like nuclear, and renewables for when they actually produce electricity) is not something that happens instantly. Just look at Germany, and their failure to decarbonize their grid for the last 3

We reduced the carbon-emitting share of our electricity from 80% (the rest was nukes) to below 50%.

We reduced the carbon-emitting share of our electricity from 80% (the rest was nukes) to below 50%.
This is still not a low-carbon grid, so it is not decarbonized… and with the gas they plan to use, it won’t become one. Also, Germany started that process 30 years ago (and spent 500 billion, but who is counting) for so little gain. Hence the term “failure”.

Dumbass, stupid lying asshole.

Dumbass, stupid lying asshole.
Putting an insult in bold does not make it more impactful. It just means you know HTML tags. Congrats.
This is still not a low-carbon grid, so it is not decarbonized
Does not matter.
And you should get a dictionary.
Decarbonized is not the same as decarbonizing.
We did not completely decarbonize because we can’t do it any quicker.
Stupid dumbass, you make me so damn angry, I’m happy we do not talk in person.
How the fuck can one be so damn stupid and lie continuously?
WE ARE DECARBONISING. WE ARE WORLD LEADER IN DECARBONIZING.
WE ARE THE _FIRST_ WHO STARTED DOING IT!!!
Yeah, and you lie about us, for what damn reason?

WE ARE THE _FIRST_ WHO STARTED DOING IT!!!
Mmm, no. Your neighbors, the French, did that in the ’70s. They went from 40% coal for electricity to 0.3-0.5%. Their grid IS decarbonized.

If you know how to do it cheaper: tell us. Asshole.

If you know how to do it cheaper: tell us. Asshole.
Build a time machine and do like your neighbors, instead of trying to slow everyone else down. If you can’t do that, extend the life of your nuclear plants, instead of shutting down even the ones that could have kept providing low-carbon electricity for a few more decades. Start building nuclear plants. They will cost less than 500 billion, for better results.

Germany is world leader in CO2 reduction. Get it. Stupid fucking idiot.

Germany is world leader in CO2 reduction. Get it. Stupid fucking idiot.
Germany is fight
Datacenter projects are conceived for the next 25-50 years; I’ve been directly involved in building 2 large ones, and the cooling systems and generators have lifecycles well over a decade. The problem is that energy density is growing exponentially and providers are refusing (look at the Netherlands) to build out more lines to the facilities.
The Chinese do it in 18 months on average, you can buy modular and micro systems in various sizes, nuclear has been ‘figured out’.

The Chinese do it in 18 months on average, you can buy modular and micro systems in various sizes, nuclear has been ‘figured out’.

The Chinese do nothing of the sort. The Chinese take only marginally less time than the West to build nuclear plants. Even minor expansion projects take them 5-10 years. Stop talking out of your arse; your posts are shitty enough as they are.
Nuclear Reactor Construction Time
It takes around 6 to 8 years to build a nuclear reactor. That’s the average construction time globally; some have been built in just 3 years.
There is no reason the US can’t do this in 5 years on average, or faster. The US military built its nuclear power plants starting in the 1960s, and it took them no more than 24 months; although the data on these systems is likely classified, it seems that SMRs can be constructed, if desired, in as little as 6 months.
Various news sources:
IPCCC r

Nuclear Reactor Construction Time
It takes around 6 to 8 years to build a nuclear reactor. That’s the average construction time globally; some have been built in just 3 years.

A reactor is not a power plant. That average doubles as soon as you exclude minor expansion projects. The average time to build a power plant in China is over 10 years; in the West it’s over 15.

China is planning at least 150 new reactors in the next 15 years

China is planning at least 150 new reactors in the next 15 years
I’m planning to be a millionaire tomorrow. Won’t happen either. China hasn’t delivered a single project within the planned window.
I quoted you the GLOBAL AVERAGE of 6-8 years, which is for completed regular nuclear reactors, and that includes the West, which has often taken 15-30 years just to go through the bureaucratic motions. There is no reason pouring a slab of concrete should take 10 years. Uranium is one of the easiest resources to find on earth.
Trams in Amsterdam are powered by electricity. In 2022, the Netherlands electricity mix [nowtricity.com] (scroll down for historical data) was:
– 47% Gas
– 27% Coal
– 15% Wind
– 6% Nuclear
– the rest is anecdotal
Which resulted in 481g CO2eq/kWh in 2022, one of the worst levels of emissions per kWh in Europe (beaten only by Poland and its 598g CO2eq/kWh).
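As a rough sanity check on that figure, you can weight the mix by typical lifecycle intensities. The per-source numbers below are approximate IPCC lifecycle medians, assumed for illustration; they are not necessarily what nowtricity actually uses:

```python
# Cross-check of the ~481 gCO2eq/kWh figure from the quoted 2022 Dutch mix.
# Per-source lifecycle intensities (gCO2eq/kWh) are approximate IPCC medians,
# used here as assumptions.
mix = {"gas": 0.47, "coal": 0.27, "wind": 0.15, "nuclear": 0.06}
intensity_g_per_kwh = {"gas": 490, "coal": 820, "wind": 11, "nuclear": 12}

grid = sum(share * intensity_g_per_kwh[src] for src, share in mix.items())
print(f"~{grid:.0f} gCO2eq/kWh")  # ~454; the unlisted ~5% of the mix and
                                  # methodology differences cover the gap to 481
```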
But anyway, the question “can computing clean up its act” is not really about that. Most of the impact of computing comes from the sheer number of devices we keep producing and consuming: smartphones, smartwatches, smart security systems, laptops… I don’t have the exact numbers right now, but I seem to remember that for smartphones, for instance, 75% of the CO2 impact came from manufacturing and shipping. Keeping a smartphone 4 years instead of the average 1.5 years actually has more positive impact than reducing its power consumption even by 50%. Same for a laptop. Same for a lot of things that we use daily.
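Back-of-envelope, with made-up but plausible numbers (say ~60 kg CO2eq total lifecycle footprint, 75% of it embodied in manufacturing and shipping, as recalled above):

```python
# Amortized footprint per year of ownership. All inputs are illustrative
# assumptions, not measured values for any particular phone.
total_kg = 60.0
embodied_kg = 0.75 * total_kg              # manufacturing + shipping, paid once
use_kg_per_year = (0.25 * total_kg) / 1.5  # use-phase emissions per year of use

def kg_per_year(years_kept, power_scale=1.0):
    return embodied_kg / years_kept + use_kg_per_year * power_scale

print(kg_per_year(1.5))       # ~40 kg/yr: replace every 1.5 years
print(kg_per_year(1.5, 0.5))  # ~35 kg/yr: same cadence, halve power use
print(kg_per_year(4.0))       # ~21 kg/yr: just keep the phone for 4 years
```

Under these assumptions, keeping the device longer beats even a 50% cut in power draw, which is the point being made.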
So you’re saying, buy Apple, whose devices have the longest lifespan in the industry.

So you’re saying, buy Apple, whose devices have the longest lifespan in the industry.

Interesting remark, which shows that reality is a bit more complex. Apple devices are well-built, and could last a long time. However, due to their high price (relative to other brands) most people buying it are on the wealthy side, and tend to buy a new (Apple) device every 1.5-2 years. You could argue that their old devices could get sold on the used market, which would be a good thing. But unfortunately most devices (not just Apple’s) rot in drawers when people replace them.
So the thing is not just to use a long-lasting device, but to actually use it for a long time. Which means accepting that some of your friends may have shinier things than you, even though you could afford them too.
In Europe, you can also buy a Fairphone 4 [fairphone.com], which is long-lasting too, and which you can basically repair with a screwdriver. Plus, they actually try to source their materials from sustainable sources when possible, and they are the most open about their subcontractors.

Interesting remark, which shows that reality is a bit more complex. Apple devices are well-built, and could last a long time. However,

…Apple doesn’t put enough battery in them to cover the current demand of the devices when the battery ages, so they have to decrease the maximum speed of the device to make it not spontaneously shut down, especially when new versions of the OS, which put more demand on the hardware, are released. And it costs enough to put a new battery in it that most people (who have a contract and get a “free” phone every couple years) will not do so, because the hardware is designed in such a way that it makes it that expensive.

And it costs enough to put a new battery in it that most people (who have a contract and get a “free” phone every couple years) will not do so, because the hardware is designed in such a way that it makes it that expensive.

It costs between $69 and $99 to have Apple replace a battery in an iPhone, and that includes the battery and labor. That is not what I would call expensive. And that is if you didn’t buy AppleCare+ on it, in which case the battery replacement is free.

The cost of a battery replacement is not what is holding people back from keeping phones longer.

And it costs enough to put a new battery in it that most people (who have a contract and get a “free” phone every couple years) will not do so, because the hardware is designed in such a way that it makes it that expensive.

I sure hope a good solution comes forward for this soon, because I can’t think of one. I’m not an iPhone user, but I don’t want to go back to leaky cases. I want a case sealed with adhesive that somehow makes battery replacement still achievable.
I believe the genesis of the sealed phone was all the warranty denials that Apple had done in the past. The little white sticker turned pink due to humidity, so they told their customers they had caused “water damage.” And really, as hardware becomes more reliable hu

Oh, was that not what you were going to say?

Nope, because your argument about batteries in Apple devices (iPhones, I guess?) being too small is just bullshit. People recharge their phone every day (or night, actually). A relative bought an iPhone SE in 2017, and it is still working and lasting the day. I am sure if she were watching Netflix all day on it, it wouldn’t, but if that is your usage pattern I guess we have nothing else to discuss.

And the phone is outdated when you buy it, so you don’t have to wait for it to become outdated before you want to replace it, you have that urge from day one? Interesting…

I have grown out of having the “urge” to have the new shiny thing just because it is shiny, since I was a teenager.
Gu

your argument about batteries in Apple devices (iPhones, I guess?) being too small is just bullshit. People recharge their phone every day (or night, actually).

First of all, that’s not enough. Second, my cheap-ass Moto G Power will last for days on a charge.

Interesting remark, which shows that reality is a bit more complex. Apple devices are well-built, and could last a long time. However, due to their high price (relative to other brands) most people buying it are on the wealthy side, and tend to buy a new (Apple) device every 1.5-2 years.

Four years, according to this [9to5mac.com] article. Four point three, according to this [techrepublic.com]. That latter article has the astonishing statement “Two out of every three devices ever sold by Apple are still in use.”

You could argue that their old devices could get sold on the used market, which would be a good thing. But unfortunately most devices (not just Apple’s) rot in drawers when people replace them.

Really?? Everybody I know trades their old phone in when they buy a new one. I’d say the question is: what happens to the old phones that are traded in to Apple (or the Verizon store, or wherever you got the phone)?

That latter article has the astonishing statement “Two out of every three devices ever sold by Apple are still in use.”

Of course, if Apple is providing the stats, they are unlikely to provide proof via spying in the OS. They are likely basing it on how many are still iCloud-locked, some of which have already been shredded because they couldn’t be re-used at the recycler.
When you trade Apple products back to Apple, I’m pretty sure they shred them to keep them off the secondary market. What little they would earn putting them back out there is more than made up for in new sales.

That latter article has the astonishing statement “Two out of every three devices ever sold by Apple are still in use.”

Of course, if Apple is providing the stats, they are unlikely to provide proof via spying in the OS. They are likely basing it on how many are still iCloud-locked, some of which have already been shredded because they couldn’t be re-used at the recycler.

According to the article, “Asymco mobile analyst Horace Dediu created a formula for device lifespan based on the number of devices sold versus the number of active devices in use.” Not clear where the “number of active devices in use” number comes from.

When you trade Apple products back to Apple, I’m pretty sure they shred them to keep them off the secondary market. What little they would earn putting them back out there is more than made up for in new sales.

Hard to say without more data. In addition to the profit from actually selling the phones, there’s a good argument that selling low-cost used iPhones to people who can’t afford the up-front cost of new iPhones would be a good way to get people into the Apple

According to the article, “Asymco mobile analyst Horace Dediu created a formula for device lifespan based on the number of devices sold versus the number of active devices in use.” Not clear where the “number of active devices in use” number comes from.

Oops, just looked at the article again, and a little later it says that the “number of active devices in use” figure comes from Apple’s quarterly investor conference call. So you were right, the number comes from Apple, and, as you point out, could be a biased number.
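For what it’s worth, the Dediu formula itself is simple enough to sketch. The sales series below is entirely made up; with Apple’s real cumulative sales and reported installed base, the articles above land around four years:

```python
# Dediu-style lifespan estimate: devices retired = cumulative sold - active now.
# The average lifespan is roughly how long ago cumulative sales had reached
# today's retired count. All figures below are toy numbers (millions).
import bisect

cumulative_sold = [100, 250, 450, 700, 1000, 1350, 1750, 2200]  # per quarter
active_now = 1350  # reported installed base (hypothetical)

retired = cumulative_sold[-1] - active_now          # 850M no longer in use
t = bisect.bisect_left(cumulative_sold, retired)    # quarter when sales hit 850M
lifespan_quarters = (len(cumulative_sold) - 1) - t
print(f"~{lifespan_quarters / 4:.1f} years")        # ~0.8 years on this toy data
```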
However, due to their high price (relative to other brands) most people buying it are on the wealthy side, and tend to buy a new (Apple) device every 1.5-2 years
I doubt anyone who has a working computer replaces it just for fun.
This is a 9-year-old Apple device. The other one is 12 years old; I was about to have the battery replaced, as it could easily run another ten years.
If you have a running laptop with a MagSafe connector, made from metal, with an SSD inside: there is no damn reason to ever buy a new one
Climate change deniers started mockingly calling them windmills, so proponents of wind power adopted the term.
It’s a good thing language is static and words never change meaning over time. Meanwhile, people still “turn up” the volume on their TV and “dial” their phones.
Archaic terms aren’t limited to verbs. Even nouns that have verbs in their names stick around long past their original use. Software bugs are no longer actual moths fried by a vacuum tube, but we still call program glitches bugs.
You don’t think a thing that uses rotational energy to do work is a metaphor for electrical generation?

But windmill is a noun, and nobody refers to wind turbine rotation as “milling.”

I’m not a native English speaker, but would you really use that term for windmill blade rotation? Isn’t milling the thing that happens with the millstones, inside the housing?

In everyday language, “windmill” does not point to the stuff happening inside (i.e. the mill), but to the shape/figure of the windmill itself. Thus, it is quite acceptable to use the same noun for these new things with rotating blades, even if they happen to generate electricity (instead of flour from grain) behind the closed d
They must look funny.

…in Amsterdam. Data centers are next.
So you will only get your data from the cloud when the wind blows?
If that’s the case then I think Elon Musk will have no problems spewing from Twitter… err, X… err, Who Are They Now.
“capable of performing up to 429 quadrillion calculations every second”
So IOW Python and NodeJS could run at the same speed as C++ on a 20-year-old Pentium!
Seriously, if the goal is to reduce the energy usage of systems then a good place to start is the development languages. Scripting languages are fast to develop with and test, but they’re hideously inefficient from a CPU-cycle and hence energy-usage POV. Since people won’t be willing to give them up, perhaps there should be far more emphasis on them being
Energy costs are reflected in total price. Having your team spend 10x longer in their offices, driving to work every day, build environments running etc to craft perfectly functioning C or Rust is more expensive and less carbon efficient than having them bang it out in Python or NodeJS.
Even so, a lot of Python libraries are already optimized and using C, so it makes very little difference if you swap out a properly written library call with actual pure C.
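That’s easy to see with a toy benchmark. This sketch just contrasts an interpreted per-element loop with a single call into NumPy’s C reduction; absolute timings will vary by machine, the gap is the point:

```python
# Pure-Python loop vs. one call into optimized C (NumPy). The former pays
# interpreter overhead per element; the latter does the whole pass in C.
import time
import numpy as np

data = np.random.rand(1_000_000)

t0 = time.perf_counter()
total = 0.0
for x in data:              # interpreted loop, one dispatch per element
    total += x
t1 = time.perf_counter()

t2 = time.perf_counter()
total_np = data.sum()       # single call into the C reduction
t3 = time.perf_counter()

print(f"python loop: {t1 - t0:.3f}s, numpy sum: {t3 - t2:.5f}s")
```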
The data center power requirements for even a small compute farm are enormous compared to the office power used by staff working 9-6.
For something like this system, with tens of thousands of CPUs, plus whatever they are using for networking, and don’t forget RAM and some sort of storage system, there is no reason to compare it to the office. The office use is just noise at these scales.
First week of freshman CS class we were taught that algorithms are everything. Hardware is secondary. Their example was running
In most cases the algorithm runs millions of times per second; having it shave off a little bit of energy at the expense of thousands of man-hours is not relevant. Again, the energy cost is in the price: it costs $500 to run but $50,000 to hire a person who can make it run for $250, and the embedded energy expense of having a person is a lot higher.
Yes, at some scale, if you CAN and NEED to scale something to that extent, it becomes worth it; most datacenters, even large compute facilities, are running thousands i
OK, so what you said is pretty much quicksort vs. bubble sort.
If current AI compute is like bubble sort, then perhaps there is a yet-to-be-discovered/created AI compute algorithm that doesn’t require millions of runs per second. Quicksort didn’t always exist; it was a big deal at the time. What if someone came up with an AI compute methodology that required tens of thousands instead of millions of calculations? Or, better yet, operated entirely differently and just ran once for a few hours and was do
I think you’re talking about the difference between research and practice. Yes, there is value in research at all levels, but that doesn’t apply across the board.
For most companies it isn’t valuable enough to port everything from Python to e.g. Rust just to save a bit of energy, as activist programmers often recommend, because at some point someone will make Python more energy efficient [github.com], and within a few years all the time and money and energy you spent has been undone by faster compilers, better interpreters and o
Ok, taken that way, I agree. Points well made about practice at a single company rewriting code in another language vs general field research. I’m with you.
The take that this data center is environmentally friendly pretty much ignores that it’s directly adding heat to the atmosphere. The warming isn’t coming indirectly from greenhouse gas emissions due to the electrical load of cooling, but it’s warming regardless.
Heat added directly to the atmosphere and/or the oceans isn’t a good thing. It’s just better than adding greenhouse gas emissions.
The heat added by human activity, regardless of what or how, is completely irrelevant in relation to the CO2 emissions.
The heat added by human activity, regardless of what or how, is completely irrelevant in relation to the CO2 emissions.
No, it isn’t. Because the threat is environmental heating; CO2 emissions trap heat in our atmosphere. We create heat directly that will be trapped by CO2 emissions. Reducing either heat trapping or heat generation, or both, serves to address the actual threat to some degree.
The two issues are directly related.
Sorry, the heat getting out of a power plant, making the environment a little bit hotter, is ZERO in relation to the amount the sun rays in that is trapped by CO2.
You seriously need to get a clue about dimensions.
Here: look at this, it is an ice cube. Small? Yep!
Look over there: that is a glacier … big, don’t you think so?
Power plant -> small heat (lots of CO2, though).
Sun -> lots of heat … not the sun’s fault, CO2’s fault.

Having your team spend 10x longer in their offices, driving to work every day, build environments running etc to craft perfectly functioning C or Rust is more expensive and less carbon efficient than having them bang it out in Python or NodeJS.

Unless you’re building it once and running it millions/billions/trillions of times. Oh wait, doesn’t that apply to a lot of code that runs in datacenters?
You obviously have little clue as to current compiler technology, which can provide JIT compilers for a variety of languages targeted to just about any platform. These compilers approach or exceed the performance of native-compiled binaries because they can take advantage of run-time information about the data types and values actually being used by the code rather than trying to infer them (usually poorly) at compile time. The fact that Python does not have one of these yet is more a testament to its develo
“You obviously have little clue as to current compiler technology”
Ah – a genius speaks!
“These compilers approach or exceed the performance of native-compiled binaries because they can take advantage of run-time information about the data types and values actually being used by the code rather than trying to infer them (usually poorly) at compile time”
If it gets it wrong at compile time the program won’t work properly at runtime, so you’re talking out of your backside. Unless you can give an example of one ty
The problem is rarely the language. The people using the scripting-style languages are overall less trained on how to write good code. When your algorithm is an order of magnitude more complex than it needs to be, then better compilation will only get you so far.
a) Not correct.
b) Scientific code is usually a loop – a single loop – going over lots of data sequentially, pulled from a file, calling one single library function, or a few, which are implemented in C++.
Most of the time there is seriously nothing a “better programmer” would do better or differently.
RFLOL – good friggin luck.
The industry might *might* occasionally take some time to consider performance on the hottest of hot paths, at huge players running massive-scale applications. The Googles and Metas of the world might spend some time optimizing things like PHP and Node where they can – but nobody is going back to running the web in C/C++. Rust / Go etc. are also not going to get out of the systems space – just like C / C++, they are not flexible or fast enough to work in. – Man hours cost and will continue to cost
Sloppy code written by the cheapest contractors around the globe will beat your energy costs any day. Business goals are usually lower total cost, not lower energy usage.
The cores of HPC codes are still largely written in Fortran and C. Python and other scripting languages are only used for configuration and program control, not the heavy lifting.
People who use Python on a supercomputer usually know more than you about computing.
I could make a long list, but I’ll cut it down to one item:
The Python “script” is a 10-liner. 3 lines load libraries, 3 open files, 3 call functions of those libraries: the Python code does not even show up on the performance profiler.

Most of the more interesting and less flattering questions pertain to just what we do with all the compute time we use.

Watching AI-generated cat videos seems to be the direction we are heading. And porn.
Going forward no data centres or supercomputers should use any non-renewable energy. They should be obliged to add enough renewable energy to cover their needs. Many do anyway, but it should be a requirement.

Many do anyway, but it should be a requirement.

Data centers that “do anyway,” as you say, were just buying carbon offsets (while it was trendy) to appear green.

Going forward no data centres or supercomputers should use any non-renewable energy. They should be obliged to add enough renewable energy to cover their needs.

Just no. Data centers don’t use renewables; they draw electricity from the grid. The distinction is subtle but significant. What would truly be beneficial is ensuring that no data centers or supercomputers are deployed in countries where the electricity grid produces CO2 emissions exceeding 50g CO2eq/kWh (arbitrary number here, but you get the point). This approach would effectively compel countrie
I’d have them add enough capacity to cover all their needs all the time. That will likely include some battery storage, although they have that anyway for UPS.
Then most of the time they will be adding renewable energy to the grid.

Then most of the time they will be adding renewable energy to the grid.

Re-read the last sentence of my last post: “You are conflating the objective (reducing CO2 emissions) with the means (deploying renewables and other low-carbon energy sources, such as nuclear or hydro).”
This is because your own personal objective/mission is to deploy renewables, even if that doesn’t mean less CO2 emitted. Instead, your objective should be to have low CO2 emissions, and thus a low-emitting grid. Build new datacenters only in countries with low-carbon grids (there are quite a few already, a
I’d rather see investment go into countries that don’t already have low carbon grids. The more renewable energy going into the grid, the less CO2 emitting generation there will be. That’s the case because renewable energy is cheaper, and because in most places fossil fuel plants have to give priority to renewables and cheaper sources when possible.

The more renewable energy going into the grid, the less CO2 emitting generation there will be.

No. In that case, if you just add renewables to cover the needs of your new datacenter (which is what you are proposing), then you just emit as much as before for everything else… Percentage-wise, the CO2-emitting generation will be less, because you added renewables for your new needs, but the amount of CO2 emitted will still be the same (and this is what matters; the % of renewables in a grid does not matter).
Actually, the CO2 emissions will likely go up even if you just build renewables for your new datacente
Read it again carefully. Add as much as needed to cover ALL the datacentre’s needs ALL the time. So that would mean enough to cover the datacentre when wind and solar are at their minimums, with battery storage allowed to cover short-term dips.
In other words they will be required to massively over-build capacity, and over a wide geographic area. Coverage will be determined based on an independent assessment of the proposal. Most of the time the built renewables will be generating far in excess of what the data
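The arithmetic behind that over-build is straightforward. Everything here is an assumed round number (the load, the capacity factors, the worst-stretch floor that batteries must bridge), just to show the shape of it:

```python
# Sizing wind+solar to cover a constant load "all the time". All inputs are
# illustrative assumptions; real siting studies use hourly generation data.
load_mw = 100                # constant datacenter draw
avg_capacity_factor = 0.35   # blended wind+solar average output fraction
worst_stretch_factor = 0.10  # output floor that storage must bridge

nameplate_mw = load_mw / worst_stretch_factor   # sized for the worst stretch
avg_output_mw = nameplate_mw * avg_capacity_factor

print(f"nameplate needed: {nameplate_mw:.0f} MW")                        # 1000 MW
print(f"average surplus to the grid: {avg_output_mw - load_mw:.0f} MW")  # ~250 MW
```

Which is exactly why, most of the time, such a build would be exporting renewable energy to the grid.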

In other words they will be required to massively over-build capacity, and over a wide geographic area.

Do you realize how stupid your proposition is? You basically mean to tell the datacenters: “hey, even though we keep telling everyone that renewables are so cheap that a decarbonized grid is right around the corner (well, for the last 30 years; must be a big corner, I guess), can you please build it for us? Err, I mean, for you, but mainly for us too. And you’ll be paying for it too.”
This is just so out of touch with reality that I can’t argue with you any more today.
Who is “we”? Not me or anyone I know. Perhaps you have a reference for that one.
I know why it’s the way it is, but I always wondered about the wisdom of putting your datacentre in Silicon Valley (ie. an almost year-round hot place). Here is an example of someone putting it deliberately where it’s cold(er).
Then there’s the possibly competing issue of putting it somewhere you can use the waste heat – domestic or even industrial units nearby are required. People tend not to live in year-round cold places, though. You also have to wonder if the climate likes its cold bits getting warmed all year, too.
Lastly, putting it somewhere that you can’t get at sensibly because it’s too remote isn’t useful, and of course you have to think about the network connections to such a place too.
FWIW, there’s a (crappy-looking) industrial unit about 10 minutes’ drive from my house that has an 8MW electrical connection (and was previously, I assume, a datacentre, although nothing on the scale of the one in TFS). I’ll bet that none of its waste heat was used to even warm the offices on site, let alone the other industrial units that surround it, and definitely not the houses that surround all of it. Again, I can see the reasons why not – but that’s sort of the problem – we need to find a way to make doing these sorts of things the obvious and easy solution, rather than the difficult and expensive one.
If we had a decent internet with good uplink speeds then cloudy data centers could be ludicrously distributed, i.e. it would actually make sense to use their elements for residential heating and the like.
Unfortunately we don’t, which is why we have data centers.
It’s not as simple as “uplink speeds.” These are spots that have multiple paths out – not just redundancy but efficient routing. Homes are never going to be that.
That’s why the cloudy aspect is important. You can use redundancy.
Companies are only in favor of reducing energy use if it happens to correspond with lower costs. Otherwise everyone simply does whatever is cheapest and makes them the most money. So, sadly, useless bloatware will continue to ship indefinitely. That’s as opposed to my bloatware, which isn’t useless. My JavaScript- and Electron-framework-based products are beautiful and elegant.
Put a small nuclear reactor next to the data center and power it for 50 years.

Only if you ignore the decommissioning cost, the radiation damage to the electronics and increased need for replacement parts, and the ever-lasting waste produced during every second of its operation.

And the green goo. Don’t forget the green goo while you are at it.
Seriously, don’t talk if you have no idea what you are babbling about.
“…everything from climate modeling to searching for new drugs”.
So, different ways of digging for cash. Not very different from bitcoin mining, in that it requires an incredibly expensive computer, incurs massive running costs, wastes huge amounts of electricity, and accomplishes absolutely nothing useful.
Will closing a data center increase the amount of energy being used elsewhere? There is a need for these computations; they are going to get done somewhere. The calculations could also lower energy use elsewhere via efficiency research.
Thing is, you don’t know. I don’t think anyone does. In the grand scheme of things this is a tiny amount of energy. There are millions of poorly insulated houses around the world that gobble up gigajoules of energy for nothing. There are stores that have the AC cranked an
If they are using the waste heat to heat homes, and the homes would be heated with another source if they didn’t use that waste heat, no, they’re not adding heat to the Arctic.
But in any case, the waste heat from energy sources is very small compared to the heat from carbon dioxide, because waste heat is produced once and dissipated, while the carbon dioxide stays in the atmosphere and continues to produce warming for a time estimated at over a century. The waste heat is really not the problem.
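The orders of magnitude back this up. Both inputs below are rounded literature values, used here as assumptions (roughly 600 EJ/yr of human primary energy use, and about 2 W/m2 of radiative forcing from the CO2 added since 1750):

```python
# Direct anthropogenic waste heat vs. CO2 radiative forcing, per square meter.
earth_area_m2 = 5.1e14
primary_energy_w = 19e12    # ~600 EJ/yr of primary energy, ~19 TW continuous

waste_heat_flux = primary_energy_w / earth_area_m2   # ~0.04 W/m2
co2_forcing = 2.1                                    # W/m2, approximate

print(f"waste heat: {waste_heat_flux:.3f} W/m2")
print(f"CO2 forcing is ~{co2_forcing / waste_heat_flux:.0f}x larger")  # ~55x
```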

Carbon dioxide is only 0.04% of the atmosphere.

Correct; just over 400 ppm (and rising). https://keelingcurve.ucsd.edu/ [ucsd.edu]

Very low concentration.

Not sure what the word “low” means. Even at the 0.04% (400 ppm) level, the infrared absorption of carbon dioxide is non-negligible over the pathlength of ~10 km through the atmosphere, and the subsequent re-radiation adds heat to the planet. This is well understood, and the infrared absorption of trace gases in the atmosphere is well measured.
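One way to see why “0.04%” misleads: compute how much CO2 a vertical light path actually crosses. Standard-atmosphere values, rounded:

```python
# Mass of CO2 in the atmospheric column above one square meter.
surface_pressure_pa = 101_325
g = 9.81
x_co2 = 420e-6               # mole fraction, ~420 ppm
m_co2, m_air = 44.01, 28.97  # molar masses, g/mol

air_column_kg = surface_pressure_pa / g                  # ~10,300 kg/m2 of air
co2_column_kg = air_column_kg * x_co2 * (m_co2 / m_air)  # mole -> mass fraction

print(f"~{co2_column_kg:.1f} kg of CO2 above each square meter")  # ~6.6 kg
```

Several kilograms of an infrared absorber in every square-meter column is plenty to matter.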

Plants and trees could not exist without it.

Correct. Nobody has been proposing a goal of zero carbon dioxide in the atmosphere. The question has been moderating the production of excess carbon dioxide.

We have been putting concrete everywhere eliminating plant life. That is a bigger problem

One effect among many, but, yes, paving the planet can contribute to global warming (asphalt probably more than concrete). Collectively, “land use” is believed to contribute about 30% to warming. (However, this is not entirely from deforestation; a lot of this comes from agricultural CO2 emissions.)
Some links: https://www.britannica.com/sci… [britannica.com]
  https://www.nature.com/article… [nature.com]
  https://theconversation.com/un… [theconversation.com]
  https://www.weforum.org/agenda… [weforum.org]
But there’s no rule that we have to pick only one factor to address. “Whatabout growing more plants?” does not preclude “let’s do something about the carbon dioxide emissions from burning fossil fuels”.

and will lead to less oxygen

Actually, the amount of oxygen is not currently a problem. Maybe eventually, but not this century.

Even if it’s heating homes it’s essentially heating the atmosphere and using energy. Population and people are the cause of warming.

The people are going to live there anyway and they do need heat in their homes. Closer to a net win than trying to air condition the desert, which is where some of the big datacenters are going.

Population and people are the cause of warming.

True if you change it to “Population and people are the cause of anthropogenic warming”.
Of course, “population” is just an impressive Latinate word for “people”. And to say that people are the cause of anthropogenic warming is a perfect tautology.
The actual cause of virtually all climate warming would be the Sun, with a teensy bit of help from volcanoes.