energy – Dataconomy (https://dataconomy.ru)

The AI and bitcoin mining clash for U.S. power has begun

A major energy struggle is unfolding in the United States, as big tech companies and cryptocurrency miners clash over power supplies. As artificial intelligence (AI) and cloud computing data centers grow rapidly, they are competing fiercely with bitcoin mining for electricity. This competition is changing the way energy is used and who gets access to it.

The new power struggle

According to Reuters, U.S. tech giants are snapping up energy resources from bitcoin miners to fuel their expanding AI and cloud computing centers. These data centers are experiencing a surge in electricity demand, which is expected to reach up to 9% of all U.S. electricity by the end of the decade. That is more than double their current usage and is outpacing the growth of power grids. As a result, tech companies like Amazon and Microsoft are scrambling for electricity wherever they can find it; Donald Trump recently pointed to the issue in an interview as well.

This scramble for power is impacting the bitcoin mining industry. Some miners are making significant profits by leasing or selling their energy infrastructure to tech firms, while others are struggling to maintain their operations due to reduced access to electricity.


Bitcoin miners’ new challenges

Bitcoin miners are facing a tough situation. Some are able to make good deals by renting out or selling their power resources, but many are losing access to the energy they need. Greg Beard, CEO of Stronghold Digital Mining, emphasizes the intensity of this competition: “The AI battle for dominance is a battle being had by the biggest and best capitalized companies in the world, and they care like their lives depend on it that they win.” This fierce competition is reshaping the energy market.

Shifting to AI

Bitcoin miners are starting to pivot towards AI and cloud computing, but this transition comes with significant challenges. Analysts predict that by 2027, up to 20% of bitcoin miners’ power capacity might shift to AI. However, turning a bitcoin mining facility into an AI data center is not straightforward. It requires expensive upgrades, such as advanced cooling systems and new infrastructure.


The timeline for setting up new AI data centers is also much longer compared to bitcoin mines. While bitcoin mines can be set up in six to twelve months, a high-tech data center may take up to three years. This difference in setup time is crucial for tech companies that need to move quickly.

Financial disparities

The financial resources of tech giants make a big difference in this energy competition. Companies like Amazon have large capital reserves and can afford to invest heavily in acquiring and developing energy resources. In contrast, many bitcoin miners are struggling financially and cannot compete with the financial power of tech giants. For example, Marathon Digital Holdings, the largest publicly traded bitcoin miner, was interested in a nuclear-powered data center but lost out to Amazon in the deal.

Looking ahead

The battle between AI-driven tech companies and bitcoin miners over energy resources is transforming the U.S. energy landscape. As data centers and cryptocurrency mining vie for power, the energy market is evolving rapidly. Technology companies are investing heavily in securing energy assets, while bitcoin miners face the challenge of adapting to this new competitive environment.

As this energy race continues, the way these two sectors interact will shape the future of energy use and availability. The outcome of this competition will have lasting effects on both industries and the overall energy market.


Featured image credit: Eray Eliaçık/Bing

Energy management in manufacturing: Trends, innovations and modern solutions

Faced with an irreversible energy transition in line with climate policy, the need to reduce emissions, and increasing regulation, the industrial sector is under intense pressure to implement change. On this topic, we speak with Jakub Kaczyński, an expert in digital solutions for industry from Transition Technologies PSC.

You have been involved in Industry 4.0 and the Industrial Internet of Things (IIoT) for more than 10 years. What is the biggest challenge for manufacturing companies?

JK: Over the past two years we have seen businesses focus on reducing costs related to the consumption of energy utilities. Now, after a relative stabilization in this area, the issue of ESG reporting is coming to the fore. An example is the European Green Deal and the European implementing acts that follow it, which impose ever newer requirements on businesses to achieve specific sustainability results. In July 2023, regulations on reporting standards (ESRS) came into force in accordance with the CSRD. The European strategy in this area is ambitious, with a target to reduce CO2 emissions by 55% by 2030 and reach net zero by 2050. It is estimated that cumulative public investment in this direction will reach €1.7 trillion by 2030.

The future of industrial energy management depends on many variables. Are we nevertheless able to identify trends and list innovations that are driving the energy market?

JK: Observing customer needs, I would identify two main aspects to consider and three technological areas that will affect energy management.

In running and growing a business, the most important things are flexibility (“agility”) and data-driven decision-making. They make it possible to identify the necessary actions and implement them. Considering how quickly and often the business, regulatory, and social environment changes, all solutions in the ESG area must be highly flexible and adaptable. This allows them to fit the specifics of a given enterprise, and not the other way around. It also supports decision-making by connecting existing data silos within organizations.

The technologies that support systems meeting the above requirements are the Industrial Internet of Things, Machine Learning, and the public cloud. The first enables data from different sources to be combined and integrated in real time, providing flexibility. Machine Learning provides more effective analysis and prediction based on the collected data. The cloud gives access to scalable and flexible storage and processing of large amounts of data. Together, they create the opportunity for even better realization of business value.
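
To make the machine-learning piece of that synergy a little more concrete, here is a minimal sketch of forecasting a production line’s hourly energy consumption from IIoT meter readings. The file name, column names, and chosen features are hypothetical illustrations for the sketch, not part of any product mentioned in this interview.

```python
# Minimal sketch: predict hourly energy consumption from IIoT meter readings.
# The CSV layout and column names are assumptions for the example.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

readings = pd.read_csv("meter_readings.csv", parse_dates=["timestamp"])

# Simple time features plus production context as predictors.
readings["hour"] = readings["timestamp"].dt.hour
readings["weekday"] = readings["timestamp"].dt.weekday
features = ["hour", "weekday", "output_units", "ambient_temp_c"]

X_train, X_test, y_train, y_test = train_test_split(
    readings[features], readings["energy_kwh"], test_size=0.2, shuffle=False
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

predictions = model.predict(X_test)
print(f"MAE: {mean_absolute_error(y_test, predictions):.1f} kWh")
```

In practice the same pattern scales from a single line to a whole plant: the IIoT layer supplies the readings, the model runs in the cloud, and the forecasts feed dashboards or alerts.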

An example of such synergy is the Energy Advisor for Manufacturing energy management system. This system enables manufacturing companies to control energy consumption and its costs during the production process. The tool collects the necessary data not only directly from production lines but also from production support systems, and allows it to be visualized with a view of historical changes. It’s a solution that not only supports entrepreneurs in the present, but also equips them with excellent, flexible tools that will help the organization respond better to changes in the energy market and new regulations.


The system you mentioned combines energy efficiency with the tenets of sustainability. What benefits does Energy Advisor bring to the energy management process at a manufacturing plant?

JK: At the outset, it’s worth noting that the results that can be obtained after implementing this solution depend on, among other things, the internal change management processes of the organization and the support of management. Benefits, on the other hand, can be divided into several areas.

First is awareness building – making visible where in the process, which lines, machines and which products are most responsible for the consumption of energy utilities. This helps avoid unnecessary costs, such as financial penalties for exceeding so-called contracted power.

Second is optimization. With all the necessary data, changes can be identified and planned, starting with the most important ones. Knowing the cost of utilities needed to produce a unit of a product, and the energy intensity of individual lines, gives you the ability to make decisions about changes to your product portfolio or production plan (a short worked example follows this list of benefits).

The third area is the “what-if” analysis on costs and energy carriers used, leading to diversification of the energy mix or tariff changes. This not only provides opportunities for further cost optimization, but also greater flexibility and streamlines the process of adapting to changing business conditions.

Fourth is change management in sustainability and energy efficiency initiatives: the ability to start targeted initiatives in a specific area and to measurably track their progress.

The fifth and final area is ESG reporting: tracking changes and progress, and collecting the data necessary to meet non-financial reporting obligations.
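
As a rough illustration of the second and third benefits above, the underlying arithmetic for energy intensity and a simple tariff what-if might look like the sketch below. The lines, consumption figures, and tariffs are invented for the example and are not output from Energy Advisor or any real plant.

```python
# Illustrative only: invented figures, not data from any real plant or product.
lines = {
    # line name: (energy used in kWh, units produced)
    "Line A": (12_500, 4_800),
    "Line B": (9_300, 2_100),
}

tariffs_eur_per_kwh = {"current": 0.21, "alternative": 0.18}  # hypothetical tariffs

for name, (kwh, units) in lines.items():
    intensity = kwh / units  # kWh needed per unit produced
    print(f"{name}: {intensity:.2f} kWh/unit")
    for tariff, price in tariffs_eur_per_kwh.items():
        print(f"  energy cost per unit ({tariff} tariff): {intensity * price:.2f} EUR")
```

Even this trivial calculation shows which line is more energy-intensive per unit and what a tariff change would mean for unit costs; a real system does the same continuously, across every line and energy carrier.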

As an industrial digitization expert, what barriers or concerns do you encounter most often when implementing new energy management solutions?

JK: The first barrier is the multitude of options available. At every turn, one encounters numerous solutions presented as “modern” and “the best”. It’s difficult to find the right one in this thicket of promises.

The second barrier is the implementation process. If it is not prepared in the right way, and no time is set aside for adaptation and for handing the system over to key users, the whole venture may end in failure.

What is your advice to companies facing this challenge?

JK: Find the right technology partner: one who will adapt to the company’s requirements, will not try to revolutionize the current state of production, and will present a sensible plan of action. It’s important to remember that industrial companies are not technology companies. For them, the choice of a particular solution, tool, or technology is secondary to the business goal to be achieved. That’s why many such companies are looking for a trusted technology partner who can support them in selecting and creating the right solution that addresses specific requirements and challenges. It was from such experiences that the Smart Factory: Explained series was created, in which we explain the principles of Industry 4.0 in short videos.

In today’s dynamic business environment, digital transformation is already an integral part of companies’ strategies. Innovative technological developments are not only transforming business processes and models but also creating a new era of customer interaction, opening the door to endless growth prospects.


Featured image credit: Federico Beccari/Unsplash

Why we need AI to power the green energy transition

Today we see clear movement and momentum toward decarbonization and the green energy transition. In parallel, the rise of digital technology and advanced analytics provides unique opportunities not only to migrate to new energy technologies, but to monitor progress, predict performance, integrate systems, ensure reliability and resiliency, and improve sustainability by optimizing products, solutions, and services like never before.

At the same time, we have changing dynamics in the sector that increase its complexity. Grids are moving from centralized to decentralized models. Energy producers have multi-OEM (original equipment manufacturer) solutions that must be monitored as a system to ensure uptime and output. Venture capital is increasing and there are many new entrants in the market, disrupting different pockets of value creation. Governments, activist investors, and communities are increasing pressure for transparency of ESG metrics along value chains.

Easy data access among different stakeholders is a key element that will foster competitiveness while maintaining equitable participation along the entire energy value chain. The markets and infrastructure of different sectors will be strongly connected in the future. For this reason, secure and trustworthy data sharing is needed to leverage innovation within and between sectors.

However, the energy industry has been slow to adopt modern digital technologies and, due to its key role as critical infrastructure, can be risk averse. The transition to digital will be slowed by poor-quality, inaccurate, or missing data; a lack of modern data architectures; and data that is often tightly restricted or hard to find. Optimizing the energy system will require much better digital information, data transparency, and open standards, while ensuring appropriate security and data protection measures. Cyber security is an absolute must to build trust, confidence, and resiliency in grid stability as well as in information flows.

To support these changes, standards and regulations are needed to fuel compatibility and interoperability; digitize information exchange; simplify product development; speed time-to-market for solutions; and improve transparency and trust. 

AI’s role in transforming the global energy landscape

One thing is certain about the future: interactions among energy systems will become substantially more complex. The laundry list of major challenges we face includes decarbonization, decentralization, energy storage, waste reduction and smart maintenance. Overcoming these challenges will require inventive ways of thinking that extend far beyond the approaches that we have traditionally applied in engineering. Artificial intelligence (AI) methods and frameworks will form the cutting edge of our efforts to overcome these intricate challenges.

To successfully master the tremendous challenges posed by the energy transformation, we need to go beyond incremental changes and come up with new and transformative innovations that transcend traditional engineering.

AI is the right technology for the job: it is well suited to the huge amounts of data being generated today along all parts of the value chain, together with ever-increasing computational resources. For example, machine learning methods make it possible to systematically tailor products, solutions, and services to specific needs. AI-based solutions also greatly help handle the increasing complexity of energy systems resulting from decarbonization and decentralization. Furthermore, they allow for improved prediction of hardware durability to optimize maintenance cycles, leading to reduced waste. By using AI, the efficiency and reliability of power plants can be increased, emissions can be reduced, and the use of materials optimized, all contributing to higher sustainability. By implementing self-optimizing processes in manufacturing, delivery times can be shortened, and the autonomous operation of power plants enables higher safety and improved grid stability through more efficient electricity generation.
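
As one hedged illustration of the point about predicting hardware durability, a maintenance team might frame the task as estimating remaining useful life from sensor histories. The data layout, feature names, and 30-day threshold below are assumptions for the sketch, not a description of any vendor’s system.

```python
# Sketch: estimate remaining useful life (RUL) of plant components from sensor
# histories. The CSV layout, feature names, and threshold are assumed examples.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

history = pd.read_csv("component_history.csv")  # one row per component per day

features = ["operating_hours", "avg_vibration_mm_s", "avg_bearing_temp_c", "start_stop_cycles"]
target = "days_until_failure"  # known only for components that have already failed

train = history[history[target].notna()]
model = GradientBoostingRegressor(random_state=0)
model.fit(train[features], train[target])

# Score the in-service fleet and flag components predicted to fail within 30 days.
in_service = history[history[target].isna()].copy()
in_service["predicted_rul_days"] = model.predict(in_service[features])
at_risk = in_service[in_service["predicted_rul_days"] < 30]
print(at_risk[["component_id", "predicted_rul_days"]])
```

Scheduling maintenance around such predictions, rather than around fixed intervals, is one concrete way AI reduces waste and unplanned downtime in the energy sector.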

The importance of ‘open data’ to society

The concept of ‘open data’ has existed for over a decade and has supported everything from the plethora of navigation solutions, to transparency in government spending, to innovation in emerging automotive applications. When certain data sets enter the ‘public domain’, we see innovation flourish in often unanticipated ways that move society forward. That said, we must clearly balance the public good against the real concerns that companies have regarding intellectual property, revenue generation opportunities, and customer consent and trust.

Why sectoral standards should be created for all ESG measures

There should absolutely be standards in place for ESG measures, including Scopes 1-3. It is in the public interest to have transparency and trust in the data reported, and in how the data are measured and calculated. Without standards, there is an increased burden, and a risk to the public interest, that information reported by different companies will not be comparable. We saw this, for instance, with Covid-19 reporting, where various countries reported statistics in ways that were not easily comparable country by country without extra legwork.

The biggest challenge is in tracking Scope 3: company supply chains. Whether it’s packaging, agriculture, manufacturing, or other suppliers, attention will continue to turn to this part of the value chain. Introducing science-based standards will bring trust and transparency to those numbers, while reducing the cost burden on companies (especially small and medium-sized businesses).

Accelerating the transformation with financial investment

From the data perspective, building up and retaining data and AI competencies is essential to keeping Europe at the leading edge of these technologies. This process spans early education, academia, and reskilling. To achieve it, close collaboration between public institutions and industry is needed, which can be driven through co-funding of research programs as well as funding for data science and AI tracks at universities at all education levels.

Venture capital and startup funding are also important for building out an ecosystem of startups which continue to fuel innovation in areas such as battery storage, AI, additive manufacturing, sensor technology, and other technologies critical to digital. 

Ensuring a balanced view between industry and public good

There is no person, no company, and no government immune to the effects of climate change. Therefore, we all have a stake in identifying solutions that drive the transition and decarbonization to net zero as quickly as possible. Digital and AI will power the solutions of the future, but industry needs government support in setting the standards that ease the path forward. Governments should work in partnership with industry and other stakeholders to develop standards that ensure the goals are reached without excessive burden and without the goals being sidestepped altogether.

We have seen this approach be successful in automotive, for example, with Safety Related Traffic Information (SRTI). [https://www.acea.auto/press-release/data-for-road-safety-moves-from-proof-of-concept-to-long-term-deployment/]

It is also important, however, to provide incentive to industry to share their intellectual property and create opportunities for value creation.

Positioning the EU as the leader in standards setting

The General Data Protection Regulation (GDPR) was ground-breaking when it was released and has since become the bellwether for privacy standards. It is often the default standard that many global companies use when managing their customers’ sensitive data around the world, as it allows them to ensure compliance while reducing complexity in their applications and systems.

In a similar way, the EU can take the leadership position on setting data and digital standards to drive interoperability to support the energy transition. To complement this, a European Standardization Framework on AI workflow development and implementation is needed.

Learning from other industries  

In addition to some of the examples above, there are examples all around us: our ability to move money easily across countries, internet standards and the rise of e-commerce, and shipping-container standards that improved transparency in logistics. There are usually good examples of another industry doing something well that we can build upon and tailor. It is then important to understand what we can learn from them, how we can model our approach on what has been proven to work, and how we can accelerate quickly with policy, investment, standards, and technologies as core pillars.


This article was originally published by www.jgde.org; it is copyrighted by www.jgde.org and is published on Dataconomy with their permission.

Study: Dealing with increasing power needs of ML

Recent research from MIT Lincoln Laboratory and Northeastern University has investigated the savings that can be made by power capping GPUs used in model training and inference, as well as several other methods to reduce AI energy use, in light of growing concern over huge machine learning models’ energy demands.

Power capping can significantly reduce energy usage when training ML

The study’s main focus is power capping (limiting the power available to the GPU training the model). The researchers find that power capping results in significant energy savings, especially for Masked Language Modeling (MLM) and models like BERT and its descendants. Language modeling is a rapidly growing area. Did you know that Pathways Language Model can explain a joke?

Similar savings in energy usage, and therefore cost, apply to larger-scale models, which have grabbed people’s attention in recent years owing to hyperscale data and new models with billions or trillions of parameters.

For bigger deployments, the researchers found that lowering the power limit to 150W produced an average 13.7% reduction in energy usage and a modest 6.8% increase in training time compared to the standard 250W maximum. If you want to dig into more detail, find out how to manage the machine learning lifecycle by reading our article.

Researchers think that power capping results in significant energy savings, especially for Masked Language Modeling.

The researchers further contend that, despite the headlines about the cost of model training in recent years, the energy requirements of utilizing those trained models are significantly greater.

“For language modeling with BERT, energy gains through power capping are noticeably greater when performing inference than training. If this is consistent for other AI applications, this could have significant ramifications in energy consumption for large-scale or cloud computing platforms serving inference applications for research and industry.”

Finally, the study claims that extensive machine learning training should be limited to the colder months of the year and at night to save money on cooling.

For language modeling with BERT, energy gains through power capping are noticeably greater when performing inference than training.

“Evidently, heavy NLP workloads are typically much less efficient in the summer than those executed during winter. Given the large seasonal variation, if there are computationally expensive experiments that can be timed to cooler months, this timing can significantly reduce the carbon footprint,” the authors stated.

The study also recognizes the potential for energy savings in optimizing model architecture and processes. However, it leaves further development to other efforts.

Finally, the authors advocate for new scientific papers from the machine learning industry to end with a statement that details the energy usage of the study and the potential energy consequences of adopting technologies documented in it.

The study, titled “Great Power, Great Responsibility: Recommendations for Reducing Energy for Training Language Models,” was conducted by six researchers from MIT Lincoln Laboratory and Northeastern University: Joseph McDonald, Baolin Li, Nathan Frey, Devesh Tiwari, Vijay Gadepally, and Siddharth Samsi.

How to create power-efficient ML?

To achieve the same level of accuracy, machine learning algorithms require increasingly large amounts of data and computing power, yet the current ML culture equates energy usage with improved performance.

According to a 2022 MIT collaboration, achieving a tenfold improvement in model performance would require a 10,000-fold increase in computational requirements, and a comparable increase in energy.

As a result, interest in more power-efficient ML training has grown in recent years. According to the researchers, the new paper is the first to focus on the influence of power constraints on machine learning training and inference, with particular attention paid to NLP approaches.

“[This] method does not affect the predictions of trained models or consequently their performance accuracy on tasks. That is, if two networks with the same structure, initial values, and batched data are trained for the same number of batches under different power caps, their resulting parameters will be identical, and only the energy required to produce them may differ,” explained the authors.

The experiments indicate that implementing power capping can significantly reduce energy usage. (Image Credit)

To evaluate the impact of power capping on training and inference, the researchers utilized Nvidia-smi (System Management Interface) and a HuggingFace MLM library.
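
For readers who want to reproduce this style of measurement, a minimal sketch of applying a power cap and sampling power draw with nvidia-smi is shown below. The 150 W value and one-second sampling interval are choices for the example; setting a power limit typically requires administrator privileges, and this is not the study’s actual instrumentation code.

```python
# Sketch: apply a GPU power cap and sample power draw via nvidia-smi.
# Illustrative only; not the instrumentation used in the study.
import subprocess
import time

def set_power_limit(gpu_index: int, watts: int) -> None:
    # Usually requires root/administrator privileges.
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

def sample_power_draw(gpu_index: int, seconds: int, interval: float = 1.0) -> list:
    samples = []
    for _ in range(int(seconds / interval)):
        out = subprocess.run(
            ["nvidia-smi", "-i", str(gpu_index),
             "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        samples.append(float(out.stdout.strip()))
        time.sleep(interval)
    return samples

if __name__ == "__main__":
    set_power_limit(0, 150)           # cap GPU 0 at 150 W
    draws = sample_power_draw(0, 60)  # sample for one minute during training
    print(f"mean power draw: {sum(draws) / len(draws):.1f} W")
```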

The researchers trained BERT, DistilBERT, and Big Bird using MLM and tracked their energy usage throughout training and deployment.

For the experiment, DeepAI’s WikiText-103 dataset was used for four epochs of training in batches of eight on 16 V100 GPUs, with four different power caps: 100W, 150W, 200W, and 250W (the default, and baseline, for an NVIDIA V100 GPU). To guard against bias, the models were trained from scratch with random initialization values.
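
A hedged sketch of what such an MLM run looks like with the Hugging Face libraries is below. It is a single-GPU approximation of the setup described above; the model configuration, sequence length, and output path are assumptions for the example, not the authors’ code.

```python
# Sketch: masked language modeling on WikiText-103 with a from-scratch BERT.
# A single-GPU approximation of the described setup, not the study's code.
from datasets import load_dataset
from transformers import (AutoConfig, AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
config = AutoConfig.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_config(config)  # random init, trained from scratch

dataset = load_dataset("wikitext", "wikitext-103-raw-v1")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-mlm-wikitext103",  # assumed output path
    num_train_epochs=4,
    per_device_train_batch_size=8,
    report_to="none",
)

trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized["train"], data_collator=collator)
trainer.train()
```

Running the same script under different nvidia-smi power limits, while logging sampled power draw alongside wall-clock time, is enough to reproduce the shape of the trade-off the paper reports.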

As demonstrated in the first graph, substantial energy savings can be achieved in exchange for relatively favorable, non-linear increases in training time.

“Our experiments indicate that implementing power caps can significantly reduce energy usage at the cost of training time,” said the authors.

The authors then used the same method to tackle a more challenging problem: training BERT in distributed configurations across numerous GPUs, which is a more typical case for well-funded and well-publicized FAANG NLP models.

The paper states:

“Averaging across each configuration choice, a 150W bound on power utilization led to an average 13.7% decrease in energy usage and 6.8% increase in training time compared to the default maximum. [The] 100W setting has significantly longer training times (31.4% longer on average).” The authors added that a 200W limit corresponds to almost the same training time as a 250W limit but offers more modest energy savings than a 150W limit.

The researchers determined that these findings support the notion of power-capping GPU architectures and applications that run on them at 150W. They also noted that energy savings apply to various hardware platforms, so they repeated the tests to see how things fared for NVIDIA K80, T4, and A100 GPUs.
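
A useful way to reason about these figures is that energy is roughly average power draw multiplied by run time, so a cap only saves energy when the drop in average draw outweighs the longer training time. The averages and run times in the sketch below are invented, chosen only to be roughly consistent with the reported 13.7% figure; GPUs rarely draw their full nominal limit, which is why the measured savings are smaller than the 150/250 ratio alone would suggest.

```python
# Energy ~ average power draw x run time. The values below are invented for the
# illustration and only roughly consistent with the reported 13.7% saving.

def energy_kwh(avg_power_watts: float, hours: float) -> float:
    return avg_power_watts * hours / 1000.0

baseline = energy_kwh(avg_power_watts=185.0, hours=10.0)   # 250 W limit, assumed draw
capped = energy_kwh(avg_power_watts=150.0, hours=10.68)    # 150 W limit, 6.8% longer

saving = 1.0 - capped / baseline
print(f"baseline: {baseline:.2f} kWh, capped: {capped:.2f} kWh, saving: {saving:.1%}")
```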

Inference requires a lot of power

Despite the headlines, prior research suggests it is inference (i.e., utilizing a trained model, such as an NLP model) rather than training that accounts for the greater share of power consumption. This implies that as popular models are commercialized and enter the mainstream, power usage might become more problematic than it is at this early phase of NLP development.

The researchers quantified the influence of inference on power usage, finding that restricting power use has a significant impact on inference latency:

“Compared to 250W, a 100W setting required double the inference time (a 114% increase) and consumed 11.0% less energy, 150W required 22.7% more time and saved 24.2% of the energy, and 200W required 8.2% more time with 12.0% less energy,” explained the authors.


The importance of PUE

The paper’s authors propose that training be scheduled for times when Power Usage Effectiveness (PUE) is at its best, that is, lowest, roughly at night and in the winter, when the data center is most efficient.

“Significant energy savings can be obtained if workloads can be scheduled at times when a lower PUE is expected. For example, moving a short-running job from daytime to nighttime may provide a roughly 10% reduction, and moving a longer, expensive job (e.g., a language model taking weeks to complete) from summer to winter may see a 33% reduction. While it is difficult to predict the savings that an individual researcher may achieve, the information presented here highlights the importance of environmental factors affecting the overall energy consumed by their workloads,” stated the authors.
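
PUE is the ratio of total facility energy to IT equipment energy, so the same job consumes less total energy when it runs in a period of lower PUE. The IT energy and seasonal PUE values below are assumptions for the illustration, not figures from the paper.

```python
# PUE = total facility energy / IT equipment energy, so for a fixed workload the
# facility-level energy scales with PUE. All values below are assumed examples.

def facility_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    return it_energy_kwh * pue

job_it_energy = 5_000.0   # kWh drawn by the servers themselves (assumed)
pue_summer_day = 1.5      # assumed PUE on a hot afternoon
pue_winter_night = 1.2    # assumed PUE on a cold night

summer = facility_energy_kwh(job_it_energy, pue_summer_day)
winter = facility_energy_kwh(job_it_energy, pue_winter_night)
print(f"summer: {summer:.0f} kWh, winter: {winter:.0f} kWh, "
      f"reduction: {1 - winter / summer:.0%}")
```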

Finally, the paper suggests that because local processing resources are unlikely to have implemented the same efficiency measures as big data centers and high-level cloud computing players, transferring workloads to regions with deep energy investments may provide environmental benefits.

“While there is convenience in having private computing resources that are accessible, this convenience comes at a cost. Generally speaking, energy savings and impact are more easily obtained at larger scales. Datacenters and cloud computing providers make significant investments in the efficiency of their facilities,” added the authors.

This is not the only attempt to create power-efficient machine learning and artificial intelligence models. The latest research shows that nanomagnets may pave the way for low-energy AI.

Apple Invests in Renewables

Apple has recently purchased a hydroelectric power plant near one of its data centers in central Oregon. The plant, located in Prineville, will primarily be used to power services like iCloud and to support a number of data centres across the country.

Given Apple’s heavy investment in massive solar farms in North Carolina to power one of its data centres there, and its claim that 100% of the energy it uses comes from renewable resources, this foray into hydroelectric power should come as no real surprise.

The company is cutting out the middleman by supplying the clean energy on which it runs its data centres, and it is only a matter of time before it becomes entirely self-sufficient.

Read more here

(Image Credit: Bob Mical)
