New iCloud storage plans got more applause than iPhone 15

The unveiling of Apple's new iCloud storage plans (6TB and 12TB) was met with perhaps the most resounding applause of the entire event, signaling just how crucial storage has become in our increasingly digital lives. This enthusiastic response underscores the growing need for expansive cloud storage solutions that can accommodate the evolving demands of consumers.

During its highly anticipated event, Apple discreetly unveiled an exciting expansion to its iCloud+ cloud storage subscription. With the introduction of two new subscription tiers, the company now offers iCloud storage plans ranging from ample to downright extravagant.

New iCloud storage plans: Photographers and filmmakers rejoice

New iCloud storage plans, boasting a whopping 6 terabytes and an astonishing 12 terabytes of storage, have sent ripples of excitement through the tech community. While the average user may not require such colossal storage capacity, these plans are poised to be a game-changer for creative professionals, particularly photographers and filmmakers.

Apple’s decision to enhance its cloud storage offerings coincides with the significant camera upgrades featured in the iPhone 15, making it a match made in tech heaven for content creators. With the ability to capture high-resolution photos and videos, creative individuals can now securely store their work without the constraints of limited storage.

Pricing that makes sense

Apple’s existing iCloud+ subscription plans offer 50 GB for $0.99 per month, 200 GB for $2.99 per month, and 2 TB for $9.99 per month. Based on this pricing structure, industry experts speculate that the new 6 TB and 12 TB subscriptions will be available at a maximum cost of $30 and $60 per month, respectively. This pricing aligns closely with Google’s cloud storage options, where 5 TB and 10 TB plans are offered at $24.99 and $49.99 per month, respectively. This competitive pricing ensures that Apple remains in the same league as its tech counterparts.

New iCloud storage plans (Image credit)



Privacy features galore

Subscribers to the new iCloud storage plans gain access to expanded storage and enjoy enhanced privacy features. Apple is renowned for its commitment to user privacy, and iCloud+ subscribers can benefit from features such as Hide My Email and Private Relay. These privacy-centric tools provide users with peace of mind, knowing that their data remains secure and their online activities are shielded from prying eyes.

In conclusion, Apple’s latest iCloud+ storage offerings mark a significant milestone in the evolution of cloud storage services. Catering to the needs of creative professionals and storage-hungry users, these new plans have the potential to redefine how we perceive and utilize cloud storage. With competitive pricing and robust privacy features, iCloud+ continues to solidify Apple’s reputation as a tech industry leader. Whether you’re a creative professional or simply a digital enthusiast, Apple’s iCloud+ now offers the perfect solution to safeguard your precious data while enabling your digital endeavors.

Featured image credit: Apple

Mastering the art of storage automation for your enterprise

Storage automation involves utilizing software and tools to automate storage tasks, which results in decreased manual labor and improved efficiency.

In today’s era of data-driven business, managing storage infrastructure can be a complex and time-consuming process. With the growing volume and complexity of data, manual storage management tasks are becoming increasingly challenging, which can lead to inefficiencies, errors, and increased costs.

However, storage automation offers a solution to these challenges, enabling organizations to manage and optimize their storage infrastructure more efficiently and effectively. Storage automation can significantly impact businesses, enabling them to leverage data more effectively and adopt agile tactics to meet the needs of a rapidly changing business environment.

What is storage automation?

Storage automation refers to the use of software tools and technologies to automate the process of managing storage resources, such as data, files, and applications, across a wide range of storage devices and systems.

Why is storage automation important?

Storage automation has become an essential component of modern IT operations because it offers numerous benefits, including:

  • Improved efficiency: Storage automation enables IT teams to perform routine tasks more quickly and accurately, reducing the time and effort required to manage storage resources.
  • Greater scalability: As data volumes continue to grow, manual storage management becomes increasingly challenging. Automation can help IT teams scale their storage infrastructure without requiring additional personnel.
  • Increased reliability: By automating storage management tasks, IT teams can reduce the risk of human error, which can lead to data loss, system downtime, and other issues.
  • Better cost management: Storage automation can help organizations optimize their storage resources, ensuring that they are using their hardware and software investments effectively.

By optimizing storage capacity and usage, organizations can reduce the need for additional storage hardware and software investments

How does storage automation work?

Storage automation involves the use of software tools and technologies to automate various storage management tasks, such as:

  • Provisioning: Storage automation can be used to provision storage resources, such as allocating storage capacity, creating new volumes, and setting access controls.
  • Monitoring: Automation tools can monitor storage usage, capacity, and performance, enabling IT teams to identify potential issues before they become problems.
  • Backup and recovery: Storage automation can be used to schedule and execute backup and recovery operations automatically, reducing the risk of data loss in the event of a system failure or other disaster.
  • Data migration: Storage automation tools can be used to move data between different storage devices or systems, enabling organizations to optimize their storage infrastructure and reduce costs.
  • Policy management: Automation can help organizations enforce storage policies, such as retention periods, access controls, and data encryption, ensuring that data is managed in a compliant and secure manner.

Storage automation helps IT teams to simplify and streamline storage management tasks, enabling them to focus on more strategic initiatives and improving the overall efficiency and reliability of their IT operations.
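
As a rough illustration of the monitoring task listed above, the following sketch checks how full a set of volumes is and flags anything over a threshold. The mount points and the 85% threshold are placeholder assumptions; a real deployment would feed these alerts into a paging or ticketing system rather than printing them.

```python
import shutil

# Minimal monitoring sketch: check utilization of a few mount points and
# flag any volume above a threshold. Paths and threshold are illustrative.
MOUNT_POINTS = ["/", "/var", "/data"]
THRESHOLD = 0.85  # alert when a volume is more than 85% full

def check_capacity(paths, threshold):
    alerts = []
    for path in paths:
        usage = shutil.disk_usage(path)
        utilization = usage.used / usage.total
        if utilization > threshold:
            alerts.append((path, utilization))
    return alerts

if __name__ == "__main__":
    for path, utilization in check_capacity(MOUNT_POINTS, THRESHOLD):
        print(f"WARNING: {path} is {utilization:.0%} full")
```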


The significance of office automation in today’s rapidly changing business world


Benefits of storage automation

Storage automation offers numerous benefits for organizations of all sizes and industries, including:

Increased efficiency

Automation can significantly improve the efficiency of IT operations by automating routine storage management tasks, such as data backup and recovery, provisioning and allocation of storage resources, and monitoring of storage usage and performance. By automating these tasks, IT teams can focus on more strategic initiatives that add value to the business, such as innovation and digital transformation.

Reduced risk of errors

Manual storage management can be prone to human error, which can lead to data loss, system downtime, and other issues. Storage automation reduces the risk of such errors by automating storage management tasks and ensuring consistency across the IT environment. Automated processes also help to eliminate the potential for errors caused by misconfiguration, missed steps, or oversights.

Cost savings

Automation can help organizations reduce costs in several ways:

  • By optimizing storage capacity and usage, organizations can reduce the need for additional storage hardware and software investments.
  • Storage automation can help to reduce the need for manual labor, freeing up IT staff to work on more valuable tasks.
  • By automating storage management tasks, organizations can reduce the risk of downtime and other issues that can be costly to resolve.

Improved security

Storage automation can help organizations improve the security of their storage infrastructure by enforcing policies around data access, retention, and encryption. Automated processes help to ensure that sensitive data is stored and managed in a compliant and secure manner, reducing the risk of data breaches and other security incidents.

Overall, storage automation is essential for organizations that want to improve the efficiency, reliability, and security of their storage infrastructure while reducing costs and freeing up IT staff to focus on more strategic initiatives.

Storage automation in AI and data science

AI and data science are driving tremendous growth in data volumes, complexity, and diversity, posing significant challenges for storage management. Storage automation plays a critical role in enabling organizations to manage and optimize their storage infrastructure to support AI and data science workloads effectively.

Challenges in managing data for AI and data science

Managing data for AI and data science involves several challenges, including:

  • Large data volumes: AI and data science workloads require vast amounts of data, which can be challenging to store and manage.
  • Diverse data types: AI and data science workloads use a wide range of data types, including structured, semi-structured, and unstructured data, which can be difficult to manage efficiently.
  • Data complexity: AI and data science workloads often involve complex data structures, such as graphs and networks, which require specialized storage and management techniques.
  • Data movement: Data for AI and data science workloads must be moved quickly and efficiently between storage devices and systems, which can be challenging to manage manually.

Storage automation is becoming an essential tool for organizations seeking to manage their storage infrastructure efficiently and effectively

Role of storage automation in AI and data science

Automation plays a critical role in managing data for AI and data science by enabling organizations to:

  • Automate data movement: Storage automation can be used to move data between different storage devices and systems automatically, enabling organizations to optimize their storage infrastructure and reduce costs.
  • Optimize storage capacity: Storage automation can help organizations optimize their storage capacity by identifying unused or underutilized storage resources and reallocating them to where they are needed.
  • Simplify data management: Storage automation can help organizations simplify data management by automating routine storage management tasks, such as data backup and recovery, provisioning and allocation of storage resources, and monitoring of storage usage and performance.
  • Improve data security: Storage automation can help organizations improve the security of their data by enforcing policies around data access, retention, and encryption.

Real-life examples of storage automation in AI and data science

Some real-life examples of storage automation in AI and data science include:

  • Amazon S3: Amazon S3 (Simple Storage Service) is a cloud-based storage service that uses storage automation to manage data for AI and data science workloads. S3 provides automated storage tiering, lifecycle policies, and other features that enable organizations to manage data efficiently and cost-effectively.
  • IBM Spectrum Scale: IBM Spectrum Scale is a high-performance, scalable storage system that uses storage automation to manage data for AI and data science workloads. Spectrum Scale provides automated data movement, tiering, and other features that enable organizations to manage data efficiently and optimize storage resources.
  • NetApp ONTAP: NetApp ONTAP is a storage management platform that uses storage automation to manage data for AI and data science workloads. ONTAP provides automated data tiering, backup and recovery, and other features that enable organizations to manage data efficiently and ensure data availability and security.

Types of storage automation

There are several types of storage automation that organizations can use to optimize their storage infrastructure and improve efficiency. Some of the most common types of storage automation include:

Automated storage tiering

Automated storage tiering is a storage automation technique that involves automatically moving data between different storage tiers based on its usage and value. This enables organizations to optimize their storage infrastructure by storing data on the most cost-effective storage tier that meets its performance and availability requirements.
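
In public-cloud environments, automated tiering is commonly expressed as a lifecycle policy that the storage service enforces on your behalf. The sketch below shows one way this might look with AWS S3 and boto3; the bucket name, prefix, and day thresholds are illustrative assumptions rather than recommendations.

```python
import boto3

# Sketch of an automated tiering rule on AWS S3: objects under the "projects/"
# prefix move to Infrequent Access after 30 days and to Glacier after 90 days.
# Bucket name, prefix, and thresholds are placeholders.
s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-analytics-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-cold-project-data",
                "Filter": {"Prefix": "projects/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```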


Unlocking the full potential of automation with smart robotics


Automated backup and recovery

Automated backup and recovery is a storage automation technique that involves automatically backing up data and recovering it in the event of a system failure, disaster, or other data loss event. Automated backup and recovery processes can help organizations ensure the availability and recoverability of their data while reducing the risk of data loss.
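
A heavily simplified sketch of automated backup with basic retention is shown below; the source directory, backup location, and seven-day retention window are placeholder assumptions, and a production setup would add a proper scheduler, off-site copies, and regular restore testing.

```python
import shutil
import time
from pathlib import Path

# Minimal backup-with-retention sketch. Source, destination, and the
# seven-day retention window are illustrative placeholders.
SOURCE = Path("/srv/app-data")
BACKUP_DIR = Path("/backups")
RETENTION_DAYS = 7

def run_backup():
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    # Create a compressed snapshot of the source directory.
    archive = shutil.make_archive(str(BACKUP_DIR / f"app-data-{stamp}"), "gztar", SOURCE)
    # Drop archives older than the retention window.
    cutoff = time.time() - RETENTION_DAYS * 86400
    for old in BACKUP_DIR.glob("app-data-*.tar.gz"):
        if old.stat().st_mtime < cutoff:
            old.unlink()
    return archive
```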

Automated data archiving

Automated data archiving is a storage automation technique that involves automatically moving data that is no longer actively used to lower-cost storage devices, such as tape drives or cloud-based storage, while still allowing it to be accessed if necessary. This can help organizations free up expensive storage resources for more critical data while still retaining access to older data.
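
As a toy example of the idea, the sketch below moves files that have not been modified for a year from primary storage to a cheaper archive mount. The paths and the age threshold are assumptions made purely for illustration.

```python
import shutil
import time
from pathlib import Path

# Minimal archiving sketch: move files untouched for a year from primary
# storage to a cheaper archive mount. Paths and the age threshold are placeholders.
PRIMARY = Path("/mnt/primary/shared")
ARCHIVE = Path("/mnt/archive/shared")
MAX_AGE_DAYS = 365

def archive_cold_files():
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for f in PRIMARY.rglob("*"):
        if f.is_file() and f.stat().st_mtime < cutoff:
            target = ARCHIVE / f.relative_to(PRIMARY)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(f), str(target))
```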

Automated storage provisioning is a storage automation technique that involves automatically provisioning and allocating storage resources

Automated storage provisioning

Automated storage provisioning is a storage automation technique that involves automatically provisioning and allocating storage resources, such as storage capacity, volumes, and access controls, based on predefined policies and requirements. This can help organizations reduce the time and effort required to manage storage resources while ensuring that the storage infrastructure is optimized and compliant.
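
The sketch below illustrates policy-driven provisioning using AWS EBS through boto3 as one possible target; the environment policy, volume sizes, availability zone, and tags are all assumptions made for the example.

```python
import boto3

# Sketch of policy-driven provisioning: request a block volume whose size and
# type come from a predefined policy rather than a manual ticket. The policy
# values and availability zone are illustrative placeholders.
POLICY = {"dev": {"size_gib": 100, "type": "gp3"},
          "prod": {"size_gib": 500, "type": "gp3"}}

def provision_volume(environment, az="us-east-1a"):
    spec = POLICY[environment]
    ec2 = boto3.client("ec2")
    volume = ec2.create_volume(
        AvailabilityZone=az,
        Size=spec["size_gib"],
        VolumeType=spec["type"],
        TagSpecifications=[{
            "ResourceType": "volume",
            "Tags": [{"Key": "environment", "Value": environment}],
        }],
    )
    return volume["VolumeId"]
```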

Automated capacity management

Automated capacity management is a storage automation technique that involves automatically monitoring and managing storage capacity, usage, and performance, to ensure that storage resources are optimized and available when needed. This can help organizations avoid downtime and performance issues caused by overloading or underutilizing storage resources.
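
A minimal sketch of the forecasting side of capacity management follows: given a few recent usage samples, it estimates how many days remain before a volume fills up. The figures are invented for illustration.

```python
# Minimal capacity-forecasting sketch: given recent daily usage samples (in GiB),
# estimate how many days remain before a volume fills up. Numbers are illustrative.
def days_until_full(samples_gib, capacity_gib):
    if len(samples_gib) < 2:
        return None
    daily_growth = (samples_gib[-1] - samples_gib[0]) / (len(samples_gib) - 1)
    if daily_growth <= 0:
        return float("inf")  # usage is flat or shrinking
    return (capacity_gib - samples_gib[-1]) / daily_growth

# Example: a 2 TiB volume growing by roughly 12 GiB per day.
print(days_until_full([1500, 1512, 1525, 1536], 2048))  # roughly 42-43 days
```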

Overall, each type of storage automation offers unique benefits and can help organizations optimize their storage infrastructure and improve the efficiency and reliability of their IT operations. By implementing storage automation techniques, organizations can reduce costs, minimize errors, and focus on more strategic initiatives that add value to the business.

Best practices for implementing storage automation

Implementing storage automation requires careful planning and execution to ensure that it delivers the expected benefits and does not cause any disruptions to IT operations. Here are some best practices for implementing storage automation:

Define your objectives

Before implementing storage automation, it is essential to define your objectives and identify the specific storage management tasks that you want to automate. This will help you select the right storage automation techniques and tools and ensure that your implementation aligns with your business goals.

Identify the right technology

Once you have defined your objectives, it is important to identify the right technology and tools for implementing storage automation. This may involve evaluating various storage automation tools and platforms, considering factors such as their features, scalability, ease of use, and compatibility with your existing IT infrastructure.

Automated capacity management is a storage automation technique that involves automatically monitoring and managing storage capacity

Evaluate your existing infrastructure

Before implementing storage automation, it is important to evaluate your existing IT infrastructure to identify any potential issues or challenges that may impact your implementation. This may involve assessing your current storage capacity, performance, and usage, as well as identifying any compatibility issues with your existing IT systems and applications.

Ensure data security and compliance

Storage automation can help organizations improve data security and compliance, but it is important to ensure that your implementation is designed with security and compliance in mind. This may involve implementing data encryption, access controls, and other security measures, as well as ensuring that your implementation complies with relevant regulatory requirements and industry standards.

Regularly monitor and fine-tune your automation processes

Finally, it is important to regularly monitor and fine-tune your storage automation processes to ensure that they are working effectively and efficiently. This may involve monitoring storage usage, performance, and capacity, as well as analyzing logs and other data to identify potential issues and opportunities for optimization. Regularly fine-tuning your automation processes can help ensure that your implementation remains effective and aligned with your business goals over time.

By following these best practices, organizations can successfully implement storage automation and achieve the benefits of improved efficiency, reduced risk of errors, cost savings, and improved security.

Final words

Storage automation is becoming an essential tool for organizations seeking to manage their storage infrastructure efficiently and effectively. As the volume and complexity of data continue to grow, manual storage management processes are becoming increasingly challenging and costly. Storage automation enables businesses to adopt agile tactics, improve data security and compliance, and reduce costs while optimizing storage infrastructure. By leveraging storage automation tools and techniques, businesses can gain a competitive edge in today’s data-driven business environment and unlock new opportunities for growth and innovation.


The intersection of technology and engineering: Automation engineers


FAQ

How do I automate my self storage business?

To automate your self storage business, you can use self-storage management software that allows you to automate tasks such as tenant billing, reservations, move-ins and move-outs, and facility maintenance. You can also use smart locks and security cameras to automate access control and monitoring of your facility. Additionally, you can implement online rental and payment options to simplify the rental process for your tenants.

Is a storage business profitable?

Yes, a storage business can be profitable if managed efficiently. Self storage businesses have relatively low operating costs and high-profit margins. The profitability of a storage business depends on several factors, such as the location, size, and occupancy rate of the facility. However, a well-run storage business can provide a steady stream of income.

How much net profit does a storage owner make?

The net profit of a storage owner depends on several factors, such as the location, size, and occupancy rate of the facility, as well as the operating costs. According to industry data, the average net profit for a self storage facility is around 30% of gross revenue. However, this can vary depending on the market and the level of competition.

What are the risks in self-storage?

Some of the risks associated with self-storage include tenant defaults on rent payments, damage to stored goods, theft, and vandalism. There is also a risk of liability if a tenant is injured on the property or if the facility is found to be in violation of safety regulations. Additionally, natural disasters such as floods or fires can pose a risk to the facility and its contents.

Why is self storage so popular?

Self storage has become popular for several reasons. One reason is the increase in population density and the resulting decrease in available living space, which has led to a greater need for storage. Additionally, the rise of e-commerce has created a demand for storage space for inventory and shipping supplies. Finally, the flexibility and convenience of self storage, with 24/7 access and various unit sizes, has made it an attractive option for individuals and businesses alike.

Where does your data go: Inside the world of blockchain storage

Where is the blockchain stored? This is not the first time someone asked this question. Blockchain technology has significantly transformed the way we store and manage data, providing a secure and decentralized approach to storing sensitive information. Blockchain storage is based on a distributed ledger system, where a network of nodes maintains a full copy of the blockchain. This innovative system results in an extremely resilient and tamper-proof storage solution that is highly resistant to cyber attacks and hacking attempts.

In this article, we will explore the concept of blockchain storage in more detail, looking at how it works, its benefits, risks, and challenges. We will also discuss the relationship between blockchain storage and cryptocurrency and examine some of the key factors that are likely to shape the future of blockchain storage. Whether you are a business owner, an IT professional, or simply interested in the latest developments in technology, this article will provide you with a comprehensive overview of the world of blockchain storage.

What is blockchain technology?

Blockchain is a digital ledger technology that allows for secure and transparent transactions without the need for a central authority. It is essentially a decentralized database that enables users to store and share information in a tamper-proof and immutable manner. The technology was initially introduced in 2008 as the underlying technology behind Bitcoin, the first cryptocurrency, and has since gained widespread adoption in various industries.

How does the blockchain work?

The blockchain works by creating a distributed network of computers (nodes) that work together to verify and validate transactions. Here are the basic steps involved in the process:

  • A user initiates a transaction by sending a request to the network.
  • The network of nodes validates the transaction and checks for any potential fraud or errors.
  • Once the transaction is validated, it is added to a block of transactions along with other transactions that have been recently validated.
  • The block is then added to the existing blockchain, creating a permanent record of the transaction.
  • The transaction is then considered complete, and the user’s account is updated accordingly.

Each block in the blockchain is linked to the previous block, creating a chain of blocks (hence the name “blockchain”). This ensures that any attempts to alter or tamper with a transaction in a block would require the modification of all subsequent blocks, which is virtually impossible. This makes the blockchain a highly secure and tamper-proof technology.
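
The following toy sketch (in Python, with invented transaction data) shows the chaining mechanism described above: each block records the hash of its predecessor, so altering any earlier block no longer matches the hash stored by the block that follows it.

```python
import hashlib
import json
import time

# Toy illustration of block chaining: each block stores the hash of the
# previous block, so changing an earlier block invalidates every later link.
def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    block = {
        "index": len(chain),
        "timestamp": time.time(),
        "data": data,
        "previous_hash": chain[-1]["hash"] if chain else "0" * 64,
    }
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)
    return block

chain = []
add_block(chain, "genesis")
add_block(chain, {"from": "alice", "to": "bob", "amount": 5})
add_block(chain, {"from": "bob", "to": "carol", "amount": 2})

# Tampering with an earlier block breaks the chain of hashes.
chain[1]["data"] = {"from": "alice", "to": "mallory", "amount": 500}
recomputed = block_hash({k: v for k, v in chain[1].items() if k != "hash"})
print(recomputed == chain[2]["previous_hash"])  # False: the tampering is detectable
```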

Where is the blockchain stored: Blockchain storage is a decentralized storage system that uses a network of computers to store data securely and immutably

Where is the blockchain stored?

The blockchain is stored on a network of computers (nodes) that participate in the validation and verification of transactions. Each node maintains a copy of the entire blockchain, which is continually updated as new transactions are added to the network. The blockchain can be stored in a decentralized or centralized manner, depending on the type of network and the storage system used.

Decentralized storage

Decentralized storage refers to a system in which the data is distributed across multiple nodes in a network, with each node maintaining a copy of the data. In a decentralized blockchain network, each node stores a copy of the blockchain, creating a distributed ledger that is highly resistant to tampering and hacking. Decentralized storage is a key feature of blockchain technology, as it enables the creation of a transparent, secure, and immutable ledger that is not controlled by a single entity.


Why data redundancy is worth the extra storage space?


Centralized storage

Centralized storage, on the other hand, refers to a system in which the data is stored on a single server or a group of servers controlled by a central authority. This type of storage is commonly used in traditional databases and information systems, where data is accessed and managed by a single entity. However, in the context of blockchain technology, centralized storage is not ideal, as it creates a single point of failure and makes the system vulnerable to hacking and cyber-attacks.

Public vs private blockchain storage

Public and private blockchains differ in terms of their storage systems. Public blockchains, such as Bitcoin and Ethereum, are decentralized and use a distributed network of nodes to store the blockchain. Anyone can join the network and participate in the validation and verification of transactions. Private blockchains, on the other hand, are typically used by organizations and enterprises and are controlled by a central authority. The storage system used in private blockchains can be either centralized or decentralized, depending on the specific needs of the organization.

The future of blockchain storage

As the adoption of blockchain technology continues to grow, the future of blockchain storage is expected to evolve and improve in a number of ways. Here are some possible developments that may shape the future of blockchain storage:

Increased scalability

One of the major challenges facing blockchain technology is its limited scalability. As more transactions are added to the blockchain, the size of the network increases, making it more difficult for nodes to validate and store the data. In the future, new solutions such as sharding and off-chain scaling may help to address this issue, enabling the blockchain to handle more transactions and become more scalable.

Interoperability between blockchains

As the number of blockchain networks continues to grow, there is a need for greater interoperability between different blockchains. This would enable users to transfer assets and data between different networks, creating a more seamless and interconnected blockchain ecosystem.

Where is the blockchain stored: The data stored on a blockchain is organized into blocks, which are cryptographically linked together to form a chain

Improved storage solutions

As blockchain technology continues to mature, new storage solutions are likely to emerge that will make it easier and more cost-effective to store data on the blockchain. For example, new decentralized storage platforms such as IPFS and Filecoin may provide more efficient and secure storage solutions for blockchain data.

Increased use of private blockchains

While public blockchains such as Bitcoin and Ethereum are well-known, private blockchains are also becoming increasingly popular. Private blockchains offer a more controlled and secure environment for businesses and organizations to store data and conduct transactions, and their adoption is likely to increase in the future.

Benefits of blockchain storage

Blockchain technology offers a number of benefits when it comes to storing data, including:

  • Security: Blockchain storage is highly secure, as each transaction is verified and validated by multiple nodes in the network, making it difficult for hackers to tamper with the data. Additionally, the use of cryptography and consensus algorithms ensures that data on the blockchain cannot be modified without the approval of the network.
  • Transparency: The transparent nature of blockchain storage means that all participants in the network can view and verify the data stored on the blockchain. This makes it easier to track and trace the origin of data and ensures that all parties are held accountable for their actions.
  • Immutability: Once data is stored on the blockchain, it cannot be modified or deleted, making it an ideal solution for storing important and sensitive data that needs to be kept secure and immutable.
  • Decentralization: Blockchain storage is decentralized, meaning that it is not controlled by a single entity or organization. This makes it less vulnerable to cyber-attacks and ensures that data is not subject to the whims of a central authority.
  • Efficiency: Blockchain storage is highly efficient, as it eliminates the need for intermediaries and central authorities, streamlining the transaction process and reducing costs.
  • Traceability: The blockchain provides a clear and verifiable record of all transactions, making it easy to track the movement of assets and data on the network.

Risks of blockchain storage

While blockchain technology offers a number of benefits when it comes to storing data, there are also some risks that need to be considered, including:

  • Storage limitations: Blockchain storage can be limited in terms of storage capacity and scalability, which can make it challenging to store large amounts of data.
  • Energy consumption: The process of validating and verifying transactions on the blockchain requires a significant amount of computational power and energy, which can make it an environmentally unsustainable solution.
  • Regulatory challenges: The decentralized and anonymous nature of blockchain technology can make it difficult to regulate and control, which can pose legal and regulatory challenges.
  • Lack of privacy: While blockchain technology provides a transparent and immutable record of transactions, it can also compromise the privacy of users, as all transactions are visible to everyone in the network.
  • Misuse of technology: Like any technology, blockchain can be misused for illegal or unethical purposes, such as money laundering, fraud, or the financing of terrorist activities.

Security of blockchain storage

Blockchain storage is known for its high level of security due to its decentralized, tamper-proof nature. Here are some of the ways that blockchain technology ensures the security of data storage:

  • Cryptography: Blockchain uses advanced cryptographic techniques to ensure that data on the network is secure and tamper-proof. Transactions on the blockchain are secured using complex mathematical algorithms that make it virtually impossible to alter the data without detection.
  • Decentralization: The decentralized nature of the blockchain means that there is no central point of control or authority, making it more difficult for hackers to breach the network. Each node in the network maintains a copy of the blockchain, and any attempts to modify the data would require a consensus among the nodes in the network.
  • Immutability: Once data is added to the blockchain, it cannot be modified or deleted, making it an ideal solution for storing important and sensitive data that needs to be kept secure and immutable.
  • Consensus algorithms: Blockchain uses consensus algorithms to ensure that all nodes in the network agree on the validity of a transaction before it is added to the blockchain. This makes it more difficult for fraudulent or malicious transactions to be added to the network.
  • Private keys: Users on the blockchain are identified by their private keys, which are secured using advanced encryption techniques. Private keys are used to sign transactions, ensuring that only the owner of the key can initiate a transaction on the blockchain.
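
To make the private-key point above concrete, the sketch below shows the general sign-and-verify pattern using Ed25519 from the Python cryptography package. Real blockchain networks use their own curves, address formats, and serialization; this only illustrates the mechanism.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Illustration of the sign/verify pattern behind blockchain transactions.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

transaction = b'{"from": "alice", "to": "bob", "amount": 5}'
signature = private_key.sign(transaction)

# Anyone holding the public key can verify the signature...
public_key.verify(signature, transaction)  # no exception: signature is valid

# ...and any change to the signed message is rejected.
tampered = b'{"from": "alice", "to": "mallory", "amount": 500}'
try:
    public_key.verify(signature, tampered)
except InvalidSignature:
    print("Tampered transaction rejected")
```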

How to become a blockchain maestro?


How to ensure secure blockchain storage?

While blockchain technology is inherently secure, there are steps that can be taken to ensure that data stored on the blockchain remains secure. Here are some best practices for ensuring secure blockchain storage:

Implement strong access controls

Access to the blockchain should be restricted to authorized users only, and strong access controls should be implemented to ensure that users are who they claim to be. This can be achieved through the use of secure authentication methods, such as two-factor authentication, biometrics, or digital certificates.

Use encryption

All data stored on the blockchain should be encrypted to prevent unauthorized access. This can be achieved through the use of strong encryption algorithms, such as AES or RSA.
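
As one concrete option, the sketch below encrypts a record with Fernet (an AES-based scheme) from the Python cryptography package before it is written to shared storage; in practice the key would be held in a key-management service rather than next to the data.

```python
from cryptography.fernet import Fernet

# Minimal sketch of encrypting data at rest with Fernet (AES in CBC mode with
# an HMAC) from the "cryptography" package. Key management is out of scope here.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"patient-id=123; diagnosis=..."
ciphertext = fernet.encrypt(record)      # store this on the shared storage layer
plaintext = fernet.decrypt(ciphertext)   # only holders of the key can recover it
assert plaintext == record
```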

Regularly monitor the network

The blockchain network should be regularly monitored for signs of suspicious activity or attempted breaches. This can be achieved through the use of network monitoring tools or security information and event management (SIEM) systems.

Where is the blockchain stored: Blockchain storage can be used for a variety of purposes, including storing cryptocurrency transactions, healthcare records, and digital identities

Maintain up-to-date software

The software used to run the blockchain network should be kept up-to-date with the latest security patches and updates to ensure that any known security vulnerabilities are addressed.

Use reputable service providers

When using third-party service providers to store data on the blockchain, it is important to choose reputable providers with a track record of security and reliability.

Train employees

All employees with access to the blockchain should receive regular training on how to use the technology securely and how to recognize and respond to potential security threats.

Blockchain storage and cryptocurrency

Cryptocurrency is one of the most well-known use cases for blockchain technology, as it relies on the blockchain to securely store and manage transactions. Here are some key points about the relationship between blockchain storage and cryptocurrency:

  • Blockchain is essential for cryptocurrency: Cryptocurrency relies on the blockchain to maintain a decentralized and secure ledger of transactions. Each transaction on the blockchain is recorded in a block, which is then added to the existing chain of blocks, creating an immutable and tamper-proof record of all transactions.
  • Cryptocurrency wallets use blockchain storage: Cryptocurrency wallets are software applications that allow users to store, send, and receive cryptocurrency. These wallets rely on the blockchain to securely store the user’s private keys and manage their cryptocurrency holdings.
  • Cryptocurrency exchanges use blockchain storage: Cryptocurrency exchanges are online platforms that allow users to buy and sell cryptocurrency. These exchanges use the blockchain to securely process and manage transactions, ensuring that all transactions are transparent, secure, and immutable.
  • Different cryptocurrencies use different blockchains: There are a variety of different cryptocurrencies, and each cryptocurrency may use a different blockchain. For example, Bitcoin uses the Bitcoin blockchain, while Ethereum uses the Ethereum blockchain. Each blockchain has its own unique features and capabilities, and may be better suited for different types of transactions.

Blockchain storage is an essential component of cryptocurrency, as it provides the security and transparency needed to create a decentralized and trustworthy system for managing transactions. As the adoption of cryptocurrency continues to grow, the importance of secure and reliable blockchain storage solutions is likely to become even more critical.

Final words

Back to our original question: Where is the blockchain stored? Well, the blockchain is stored on a network of computers (nodes) that participate in the validation and verification of transactions. Each node maintains a copy of the entire blockchain, creating a distributed and decentralized ledger that is highly resistant to tampering and hacking.

One of the key tricks of blockchain storage is its use of cryptography and consensus algorithms to ensure the security and integrity of the data stored on the network. This makes it highly secure and tamper-proof, creating a transparent and immutable ledger that is ideal for storing sensitive and important data.

Additionally, blockchain storage offers a range of benefits, including decentralization, transparency, and efficiency. While there are also some risks and challenges associated with blockchain storage, these can be addressed through the use of appropriate security measures and best practices.

As the adoption of blockchain technology continues to grow, we can expect to see new and innovative solutions emerge that will enable blockchain to become an even more powerful and transformative technology for data storage and management.

FAQ

Is blockchain stored in the cloud?

Blockchain can be stored in the cloud, but it is not limited to this storage solution. The blockchain is essentially a distributed ledger that is stored on a network of computers (nodes) that work together to verify and validate transactions. The nodes can be located anywhere in the world, and the blockchain can be stored on a combination of cloud-based and on-premises storage solutions.

Where is the blockchain stored: Blockchain storage is more secure than traditional storage methods because it is resistant to hacking and tampering

Does blockchain have a database?

Yes, blockchain technology is essentially a type of database, but it differs from traditional databases in a number of ways. Unlike traditional databases, which are typically centralized and controlled by a single entity, the blockchain is decentralized and distributed across a network of nodes. Additionally, the blockchain is designed to be highly secure and tamper-proof, using cryptography and consensus algorithms to ensure the integrity of the data stored on the network.


How data engineers tame Big Data?


Who stores the data in the blockchain?

The data stored on the blockchain is maintained by a network of nodes, which are responsible for validating and verifying transactions. Each node in the network maintains a copy of the entire blockchain, creating a distributed ledger that is highly resistant to tampering and hacking. The data stored on the blockchain is secured using cryptography and consensus algorithms and can only be modified with the agreement of the nodes in the network.

Is blockchain stored on every computer?

Yes, every node in the blockchain network maintains a copy of the entire blockchain, creating a distributed and decentralized ledger that is highly secure and resistant to tampering. This means that the blockchain is stored on every computer that participates in the network, creating a highly redundant and fault-tolerant storage solution. The use of multiple nodes ensures that the blockchain remains accessible even if one or more nodes go offline or become compromised.

Enterprise cloud storage is the foundation for a successful remote workforce

The widespread adoption of enterprise cloud storage has the potential to fundamentally change the way businesses operate. As more and more organizations move their data and applications to the cloud, the need for on-premises IT infrastructure will decrease, and businesses will become increasingly reliant on cloud-based services.

This could lead to the emergence of new business models and ways of working, such as fully remote or distributed teams. Additionally, the ability to easily access and share data from anywhere with an internet connection could lead to increased collaboration and productivity, as well as the development of new, data-driven technologies and services. Ultimately, the widespread adoption of enterprise cloud storage could help drive the continued growth and evolution of the digital economy.

What is enterprise cloud storage?

Enterprise cloud storage refers to businesses’ use of cloud-based storage solutions to store, access, and manage their data. These solutions typically involve using remote servers hosted by a third-party provider and accessed over the internet via a secure network connection.

Many businesses are turning to enterprise cloud storage to improve their data management and collaboration capabilities

The use of cloud storage allows businesses to avoid the need for expensive on-premises storage infrastructure and to easily scale their storage capacity up or down as needed. Additionally, enterprise cloud storage often offers enhanced security for data compared to traditional storage methods, making it a popular choice for many organizations.

4 types of cloud storage

The four main types of cloud storage are public, private, hybrid, and community cloud storage. Each of these types of cloud storage has unique characteristics and benefits, and the right choice for a particular organization will depend on its specific needs and requirements. Here is a brief overview of each of the four types of cloud storage:

  • Public cloud storage: Public cloud storage involves using remote servers owned and operated by a third-party provider and made available to the general public. Public cloud storage is typically the most cost-effective option, as users only pay for their storage capacity and services.
  • Private cloud storage: Private cloud storage involves using remote servers owned and operated by a single organization and not made available to the general public. Private cloud storage gives organizations greater control and security over their data, but it can also be more expensive than public cloud storage.
  • Hybrid cloud storage: Hybrid cloud storage involves a combination of on-premises storage infrastructure and cloud-based storage services. This allows organizations to take advantage of the benefits of both private and public cloud storage and can provide them with greater flexibility and scalability.
  • Community cloud storage: Community cloud storage involves using remote servers shared by a group of organizations with similar requirements and needs. Community cloud storage can provide organizations with the benefits of both public and private cloud storage, allowing them to share the cost of the underlying infrastructure.

How does cloud storage work?

Cloud storage is a data storage solution provided by a third-party service provider and accessed over the internet. The service provider maintains and manages a network of servers that are used to store the company’s data. When an employee wants to access the company’s data, they connect to the service provider’s servers over the internet using a web-based interface or a specialized client application.


How object storage helps address unstructured data’s increased security risks


The service provider typically uses a technique called data replication to ensure that the company’s data is always available and accessible. This involves copying the data to multiple servers within the service provider’s network so that if one server goes down, the data can still be accessed from another server.
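
A toy illustration of the replication idea is sketched below: the same object is copied to several storage targets and each copy is verified by checksum. The paths are placeholders; actual cloud providers perform this replication transparently across servers and data centers.

```python
import hashlib
import shutil
from pathlib import Path

# Toy replication sketch: write the same object to several storage targets
# and verify each copy by checksum. The target paths are placeholders.
REPLICA_DIRS = [Path("/mnt/replica-a"), Path("/mnt/replica-b"), Path("/mnt/replica-c")]

def checksum(path):
    return hashlib.sha256(path.read_bytes()).hexdigest()

def replicate(source):
    source = Path(source)
    expected = checksum(source)
    for target_dir in REPLICA_DIRS:
        target_dir.mkdir(parents=True, exist_ok=True)
        copy = target_dir / source.name
        shutil.copy2(source, copy)
        assert checksum(copy) == expected, f"corrupt replica at {copy}"
```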

Enterprise cloud storage allows businesses to access their data from anywhere with an internet connection

In addition to data replication, cloud storage providers also use various security measures to protect the company’s data against unauthorized access and data breaches. These measures can include encryption, authentication, and access control protocols.

When using cloud storage, companies are typically billed on a pay-as-you-go basis, which means that they only pay for the amount of storage capacity they actually use. This makes cloud storage a cost-effective solution for companies that need to store and manage large amounts of data.

How safe is enterprise cloud storage?

Enterprise cloud storage can be very safe when it is used properly. Many enterprise cloud storage providers offer advanced security features, such as encryption and access controls, to help protect businesses’ data from unauthorized access or breaches. Encryption involves the use of complex algorithms to encode data, making it unreadable to anyone who does not have the appropriate decryption key. Access controls, on the other hand, allow businesses to specify which users or groups have permission to access their data and what actions they are allowed to perform. This can help prevent unauthorized access to sensitive data.

In addition to these security features, enterprise cloud storage providers typically use secure data centers to store businesses’ data. These data centers are typically equipped with advanced security measures, such as surveillance cameras, biometric authentication, and physical security guards, to prevent unauthorized access. Additionally, the servers that are used for enterprise cloud storage are typically monitored and managed by experienced IT professionals, who can quickly respond to any security threats or incidents that may arise.


All businesses need an inclusive approach for their data HQ


Overall, the use of enterprise cloud storage can provide businesses with a high level of security for their data. However, it is important for businesses to carefully evaluate the security features and practices of any enterprise cloud storage provider they are considering and to ensure that their own internal policies and procedures are in place to protect their data. This can help ensure that their data is stored and managed securely in the cloud.

Enterprise storage vs cloud storage

Enterprise storage and cloud storage are both types of data storage solutions that are used to store and manage large amounts of data. Enterprise storage refers to data storage solutions that are deployed within a company’s own on-premises data center, while cloud storage refers to data storage solutions that are provided by a third-party service provider and accessed over the internet.

Here are some key differences between enterprise storage and cloud storage:

  • Location: Enterprise storage is located on-premises within a company’s own data center, while cloud storage is located off-premises and is provided by a third-party service provider.
  • Ownership: Enterprise storage is owned and managed by the company itself, while cloud storage is owned and managed by the third-party service provider.
  • Accessibility: Enterprise storage is typically only accessible to employees within the company, while cloud storage is accessible from anywhere with an internet connection.
  • Scalability: Enterprise storage can be difficult to scale up or down as the company’s needs change, while cloud storage is highly scalable and can be easily expanded or contracted as needed.
  • Cost: Enterprise storage can be expensive to implement and maintain, while cloud storage is typically more cost-effective and is billed on a pay-as-you-go basis.

The use of enterprise cloud storage can help reduce the need for on-premises storage infrastructure

Advantages and disadvantages of using enterprise cloud storage

There are several advantages to using enterprise cloud storage, including:

  • Flexibility: Enterprise cloud storage is highly flexible and can be easily scaled up or down as the company’s needs change. This allows companies to quickly adjust their data storage capacity to meet their changing needs without incurring additional costs.
  • Accessibility: Enterprise cloud storage is accessible from anywhere with an internet connection, which means that employees can access the company’s data from any location. This can be particularly useful for companies with a distributed workforce or for employees who need to access data while traveling.
  • Cost-effectiveness: Enterprise cloud storage is typically more cost-effective than on-premises storage solutions. This is because cloud storage is billed on a pay-as-you-go basis, which means that companies only pay for the storage capacity they actually use.
  • Reliability: Enterprise cloud storage is provided by third-party service providers who have the expertise and resources to ensure the reliability and uptime of their data storage solutions. This means that companies can be confident that their data is always available and accessible.
  • Security: Enterprise cloud storage is typically more secure than on-premises storage solutions. This is because cloud storage providers have dedicated teams of security experts who are responsible for protecting the company’s data against unauthorized access, data breaches, and other security threats.

While there are many advantages to using enterprise cloud storage, there are also some potential disadvantages to consider. These include:

  • Dependence on internet connectivity: Enterprise cloud storage relies on internet connectivity in order to access data. This means that if there is an interruption in internet service, employees will be unable to access the company’s data until the issue is resolved.
  • Limited control: When using enterprise cloud storage, companies rely on the service provider to manage and protect their data. This means that they may have less control over their data than they would with an on-premises storage solution.
  • Potential security concerns: While enterprise cloud storage providers take steps to protect the company’s data, there is always a risk that data could be accessed by unauthorized individuals or that data breaches could occur. This means that companies must carefully evaluate the security measures in place before using enterprise cloud storage.
  • Compliance concerns: Some industries, such as healthcare and finance, have strict regulations regarding the storage and handling of sensitive data. Using enterprise cloud storage may make it difficult for companies in these industries to comply with these regulations.
  • Data migration challenges: Migrating large amounts of data from on-premises storage to the cloud can be a complex and time-consuming process. Companies must carefully plan and execute the data migration to ensure that all data is transferred successfully and without disruption to business operations.

Why is cloud storage important for media companies?

Cloud storage is important for media companies for a number of reasons. First and foremost, media companies typically generate and deal with large amounts of data, including audio and video files, images, and other types of multimedia content.

This data can be difficult and expensive to store and manage using traditional on-premises storage solutions. Cloud storage, on the other hand, allows media companies to easily and cost-effectively store and access their data from anywhere with an internet connection.

Additionally, the scalability of cloud storage means that media companies can easily increase their storage capacity as needed without having to invest in additional hardware or infrastructure. Furthermore, the enhanced security features of many cloud storage solutions can help protect media companies’ valuable data from unauthorized access or breaches.

Enterprise cloud storage can provide businesses with increased flexibility and scalability

Best enterprise cloud storage providers

  • Egnyte Business: A mature and complete platform for safe cloud storage and sharing. Our Editors’ Choice award goes to this solution because of its centralized file storage features and unwavering dependability.
  • Microsoft OneDrive for Business: Expands the types of files it can access, making it an obvious choice for companies with a Microsoft focus. Artificial intelligence makes it faster and easier to reach recently changed and crucial files, folders, and projects.
  • IDrive Team: A reliable option for cloud backup for small businesses. End-to-end data encryption is one of its many strong features, and it can handle both small offices and distributed teams or remote workers.
  • Citrix Content Collaboration: A long-standing, business-oriented cloud storage solution that has been skillfully transformed into a thriving collaboration platform with an emphasis on security.
  • Dropbox Business: An outstanding online file storage option for small to midsize organizations. It has features like Smart Sync and Remote Wipe and puts more of an emphasis on teamwork.



Conclusion

In conclusion, enterprise cloud storage is a popular and increasingly essential technology for businesses of all sizes. It allows organizations to store, access, and manage their data and files using remote servers.

However, enterprise cloud storage also has some disadvantages and risks that businesses need to consider. For example, it can be difficult for organizations to maintain control over their data when it is stored in the cloud, and there is always the possibility of data breaches or other security incidents.

Despite these risks, the widespread adoption of enterprise cloud storage is likely to continue, and it has the potential to open up new opportunities for businesses in the future. For example, the ability to easily access and share data from anywhere with an internet connection could lead to increased collaboration and productivity, as well as the development of new, data-driven technologies and services. Overall, enterprise cloud storage is an important technology that is likely to play a key role in the future of doing business.

]]>
https://dataconomy.ru/2022/12/06/enterprise-cloud-storage-types-advantages/feed/ 0
The history of Machine Learning – dates back to the 17th century https://dataconomy.ru/2022/04/27/the-history-of-machine-learning/ https://dataconomy.ru/2022/04/27/the-history-of-machine-learning/#respond Wed, 27 Apr 2022 15:13:51 +0000 https://dataconomy.ru/?p=23534 Contrary to popular belief, the history of machine learning, which enables machines to learn tasks for which they are not specifically programmed, and train themselves in unfamiliar environments, goes back to the 17th century. Machine learning is a powerful tool for implementing artificial intelligence technologies. Because of its ability to learn and make decisions, machine […]]]>

Contrary to popular belief, the history of machine learning, which enables machines to learn tasks for which they are not specifically programmed, and train themselves in unfamiliar environments, goes back to the 17th century.

Machine learning is a powerful tool for implementing artificial intelligence technologies. Because of its ability to learn and make decisions, machine learning is frequently referred to as AI, even though it is technically a subdivision of AI technology. Until the late 1970s, machine learning was only another component of AI’s progress. It then diverged and evolved on its own, eventually emerging as an important function in cloud computing and e-commerce. ML is a vital enabler in many cutting-edge technology areas of our time. Scientists are currently working on quantum machine learning approaches.

Remembering the basics

Before embarking on our historical adventure that will span several centuries, let’s briefly go over what we know about Machine Learning (ML).

Today, machine learning is an essential component of business and research for many organizations. It employs algorithms and neural network models to help computers get better at performing tasks. Machine learning algorithms create a mathematical model from data – also known as training data – without being specifically programmed.
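
To make that idea concrete, here is a minimal, hypothetical sketch in plain Python; the data and the energy-usage scenario are invented purely for illustration. A tiny linear model is fitted from training pairs, and the learned parameters, not hand-written rules, drive the prediction.

```python
# Minimal sketch: "learning" a linear model from training data instead of
# hard-coding the rule. The numbers below are made up for illustration.

def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b from training pairs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical training data: machine running hours vs. energy consumed.
hours = [1, 2, 3, 4, 5]
energy = [2.1, 3.9, 6.2, 8.1, 9.8]

a, b = fit_line(hours, energy)
print(f"learned model: energy ~= {a:.2f} * hours + {b:.2f}")
print(f"prediction for 6 hours: {a * 6 + b:.2f}")
```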

The brain cell interaction model that underpins modern machine learning is derived from neuroscience. In 1949, psychologist Donald Hebb published The Organization of Behavior, in which he proposed the idea of “endogenous” or “self-generated” learning. However, it took centuries, and inventions as unlikely as a data-storing weaving loom, to arrive at the kind of understanding of learning that Hebb articulated in 1949. The developments that followed were just as astonishing, and on some occasions jaw-dropping.

The history of Machine Learning

For ages, we, the people, have been attempting to make sense of data, process it to obtain insights, and automate this process as much as possible. And this is why the technology we now call “machine learning” emerged. Now buckle up, and let’s take an intriguing journey through the history of machine learning to discover how it all began, how it evolved into what it is today, and what the future may hold for this technology.

· 1642 – The invention of the mechanical adder

Blaise Pascal created one of the first mechanical adding machines as an attempt to automate data processing. It employed a mechanism of cogs and wheels, similar to those in odometers and other counting devices.

Pascal was inspired to build a calculator to assist his father, the superintendent of taxes in Rouen, with the time-consuming arithmetic computations he had to do. He designed the device to add and subtract two numbers directly, and to perform multiplication and division through repeated addition and subtraction.

The history of machine learning: Here is a mechanical adder or a basic calculator

The calculator had articulated metal wheel dials with the digits 0 through 9 displayed around the circumference of each wheel. To input a digit, the user inserted a stylus into the corresponding space between the spokes and turned the dial until a metal stop at the bottom was reached, much like dialing a number on an old rotary phone. The number appeared in the top left window of the calculator. To add, the user simply dialed in the second number, and the accumulator displayed the running total. The machine also featured a carry mechanism: when a dial moved past nine, it carried one over to the next dial.

· 1801 – The invention of the data storage device

When looking at the history of machine learning, there are lots of surprises, and our first encounter is with a data storage device. Believe it or not, the first data storage device was, in fact, a weaving loom. Data was first stored in a loom created by the French inventor Joseph-Marie Jacquard, which used metal cards punched with holes to arrange threads. These cards made up a program that controlled the loom and allowed a procedure to be repeated with the same outcome every time.

The history of Machine Learning: A Jacquard loom showing information punchcards, National Museum of Scotland

The Jacquard Machine used interchangeable punched cards to weave cloth in any pattern without human intervention. Punched cards were later used by Charles Babbage, the famous English inventor, as an input-output medium for his theoretical Analytical Engine, and by Herman Hollerith to feed data to his census machine. They were also used to input data into digital computers before eventually being superseded by electronic equipment.

· 1847 – The introduction of Boolean Logic

In Boolean Logic (also known as Boolean Algebra), all values are either True or False. These true and false values are used to check the conditions that selection and iteration rely on, and this is how Boolean operators work. George Boole created the AND, OR, and NOT operators using this logic to answer questions of true or false, yes or no, and binary 1s and 0s. These operators are still used in web searches today.

Boolean algebra has been brought into artificial intelligence to address some of the problems associated with machine learning. One of the field’s main drawbacks is that many machine-learning algorithms are black boxes: we know little about how they reach their decisions. Decision trees and random forests are examples of models that can describe how a system works, but they don’t always deliver the best results. Boolean algebra helps overcome this limitation: it has been used in machine learning to produce sets of understandable rules that can still achieve quite good performance.
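
As a hypothetical illustration of how Boolean operators can yield human-readable rules (the feature names and thresholds below are invented, not drawn from any real system), consider a tiny rule-based classifier in Python:

```python
# Hypothetical sketch of an interpretable Boolean-rule "model". Each rule
# combines AND / OR / NOT over simple conditions, so a person can read
# exactly why a record was flagged. Feature names and thresholds are made up.

def is_high_risk(record):
    # (large transfer AND new account) OR (foreign IP AND NOT verified)
    return (record["amount"] > 10_000 and record["account_age_days"] < 30) or \
           (record["foreign_ip"] and not record["verified"])

sample = {"amount": 12_500, "account_age_days": 10,
          "foreign_ip": False, "verified": True}
print(is_high_risk(sample))  # True: a large transfer from a brand-new account
```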

After reading the history of machine learning, you might want to check out 75 Big Data terms everyone should know.

· 1890 – The Hollerith Machine took on statistical calculations

Herman Hollerith developed the first combined mechanical calculation and punch-card system to compute statistics from millions of individuals efficiently. It was an electromechanical machine built to assist in summarizing data stored on punched cards.

The history of machine learning: Statistical calculations were first made with electromechanical machines

The 1880 census in the United States had taken eight years to process, and because the Constitution requires a census every ten years, a faster approach was needed for 1890. The tabulating machine was created to aid in processing the 1890 Census data. Later versions were widely used in commercial accounting and inventory management applications, giving rise to a class of machines known as unit record equipment and to the data processing industry.

· 1943 – The first mathematical model of a biological neuron presented

The scientific article “A Logical Calculus of the Ideas Immanent in Nervous Activity,” published by Walter Pitts and Warren McCulloch, introduced the first mathematical model of neural networks. For many, that paper was the real starting point for the modern discipline of machine learning, which led the way for deep learning and quantum machine learning.

McCulloch and Pitts’s 1943 paper built on Alan Turing’s “On Computable Numbers” to provide a means of describing brain activity in general terms, demonstrating that basic components linked in a neural network could have enormous computational capability. The paper received little attention until its ideas were applied by John von Neumann, the architect of modern computing, Norbert Wiener, and others.

· 1949 – Hebb successfully related behavior to neural networks and brain activity

In 1949, Canadian psychologist Donald O. Hebb, then a lecturer at McGill University, published The Organization of Behavior: A Neuropsychological Theory. This was the first time that a physiological learning rule for synaptic change had been made explicit in print and became known as the “Hebb synapse.” 

The history of machine learning: Neural networks are used in many AI systems today

In this book, Hebb introduced what became known as cell assembly theory. His model was later referred to as Hebbian theory, Hebb’s rule, Hebb’s postulate, and cell assembly theory, and models that follow this idea are said to exhibit “Hebbian learning.” As stated in the book: “When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”

Hebb’s model paved the way for the development of computational machines that replicated natural neurological processes

Hebb referred to combinations of neurons that can be regarded as a single processing unit as “cell assemblies,” and the mix of connections among them determined how the brain changes in response to stimuli.

Hebb’s model for the functioning of the mind has had a significant influence on how psychologists view stimulus processing in mind. It also paved the way for the development of computational machines that replicated natural neurological processes, such as machine learning. While chemical transmission became the major form of synaptic transmission in the nervous system, modern artificial neural networks are still built on the foundation of electrical signals traveling through wires that Hebbian theory was created around.

·  1950 – Turing found a way to measure the thinking capabilities of machines

The Turing Test is a test of artificial intelligence (AI) for determining whether or not a computer thinks like a human. The term “Turing Test” derives from Alan Turing, an English computer scientist, cryptanalyst, mathematician, and theoretical biologist who invented the test.

According to Turing, it is impossible to define intelligence in a machine directly; instead, if a computer can mimic human responses under specific circumstances, it may be said to possess artificial intelligence. The original Turing Test requires three terminals, each physically separated from the others. One terminal is controlled by a computer, while humans operate the other two.

The history of Machine Learning: The IBM 700 series made scientific calculations and commercial operations easier, but the machines also provided the world with some entertainment (Image courtesy of IBM)

During the experiment, one of the humans serves as the questioner, with the second human and the computer acting as respondents. The questioner puts questions to the respondents within a specific area of study, using a specified format and context. After a set duration or number of queries, the questioner is asked to decide which respondent was human and which was the machine. The test is carried out numerous times; if the questioner identifies the machine correctly in half of the test runs or fewer, the computer is considered to have demonstrated artificial intelligence.

The test was named after Alan Turing, who pioneered thinking about machine intelligence during the 1940s and 50s. In 1950, Turing published the paper “Computing Machinery and Intelligence,” in which he outlined the test.

· 1952 – The first computer learning program was developed at IBM

Arthur Samuel’s Checkers program, which was created for play on the IBM 701, was shown to the public for the first time on television on February 24, 1956. Robert Nealey, a self-described checkers master, played the game on an IBM 7094 computer in 1962. The computer won. The Samuel Checkers program lost other games to Nealey. However, it was still regarded as a milestone for artificial intelligence and provided the public with an example of the abilities of an electronic computer in the early 1960s.

The more the program played, the better it performed: operating in a ‘supervised learning mode,’ it learned which moves made up winning strategies and incorporated them into its algorithm.

Samuel’s program was a groundbreaking story for the time. Computers could beat checkers for the first time. Electronic creations were challenging humanity’s intellectual advantage. To the technology-illiterate public of 1962, this was a significant event. It established the groundwork for machines to do other intelligent tasks better than humans. And people started to think; will computers surpass humans in intelligence? After all, computers were only around for a few years back then, and the artificial intelligence field was still in its infancy…

Moving on in the history of machine learning, you might also want to check out Machine learning engineering: The science of building reliable AI systems.

· 1958 – The Perceptron was designed

In July 1958, the United States Office of Naval Research unveiled a remarkable invention: the perceptron. An IBM 704, a five-ton computer the size of a room, was fed a series of punch cards and, after 50 tries, learned to distinguish cards marked on the left from cards marked on the right.

It was a demonstration of the “perceptron,” which, according to its inventor, Frank Rosenblatt, was “the first machine capable of generating an original thought.”

“Stories about the creation of machines having human qualities have long been a fascinating province in the realm of science fiction,” Rosenblatt observed in 1958. “Yet we are about to witness the birth of such a machine – a machine capable of perceiving, recognizing, and identifying its surroundings without any human training or control.”

He was right about his vision, but it took almost half a century to deliver on it.
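
The learning rule behind the device is simple enough to sketch in a few lines of software. The example below is a hedged, modern re-creation of the perceptron update rule, not a model of Rosenblatt's actual hardware, and the "left mark vs. right mark" data is invented:

```python
# Hedged software sketch of the perceptron learning rule (not Rosenblatt's
# Mark I hardware). It learns to separate cards "marked on the left"
# (label 0) from cards "marked on the right" (label 1) on toy inputs.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in zip(samples, labels):
            predicted = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = target - predicted        # -1, 0 or +1
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Toy features: (ink on the left half, ink on the right half) of a card.
samples = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9), (0.2, 0.7)]
labels = [0, 0, 1, 1]

weights, bias = train_perceptron(samples, labels)
print(weights, bias)
```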

· The 60s – Bell Labs’ attempt to teach machines how to read

The term “deep learning” was inspired by a report from the late 1960s describing how scientists at Bell Labs were attempting to teach computers to read English text. The emergence of artificial intelligence, or “AI,” as a field in the early 1950s had already begun the trend toward what is now known as machine learning.

· 1967 – Machines gained the ability to recognize patterns 

The “nearest neighbor” algorithm was created, allowing computers to conduct rudimentary pattern detection. When the program was given a new object, it compared it to the existing data and assigned it the class of its nearest neighbor, that is, the most similar item in memory.

The history of machine learning: Pattern recognition is the basis of many AI developments achieved till now

The underlying idea is credited to Fix and Hodges, who detailed their non-parametric technique for pattern classification in an unpublished 1951 report for the US Air Force School of Aviation Medicine; it was there that the k-nearest neighbor rule was first introduced.
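
The idea is straightforward to express in code. Here is a hedged sketch of 1-nearest-neighbor classification with invented points and labels; it simply assigns a new object the label of the most similar item already stored in memory:

```python
# Hedged sketch of 1-nearest-neighbor classification: a new object gets
# the label of the closest stored example. Points and labels are made up.

import math  # math.dist requires Python 3.8+

memory = [((1.0, 1.2), "circle"),
          ((0.9, 0.8), "circle"),
          ((4.1, 3.9), "square"),
          ((4.3, 4.2), "square")]

def classify(point):
    nearest = min(memory, key=lambda item: math.dist(item[0], point))
    return nearest[1]

print(classify((1.1, 1.0)))  # "circle": the closest stored example is a circle
print(classify((3.8, 4.0)))  # "square"
```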

· 1979 – One of the first autonomous vehicles was invented at Stanford

The Stanford Cart was a decades-long endeavor that evolved in various forms from 1960 to 1980. Initially a remote-controlled, television-equipped mobile robot, it began as a study of what it would be like to operate a lunar rover from Earth and was eventually revitalized as an autonomous vehicle: on its own, the student-built cart could maneuver around obstacles in a room.

The history of Machine Learning: The infamous Stanford Cart (Image courtesy of Stanford University)

A computer program was created to steer the Cart through cluttered spaces, obtaining all of its information about the world from on-board TV images. The Cart used a form of stereo vision to locate objects in three dimensions and to estimate its own motion. Based on a model built from this data, it planned an obstacle-avoiding route to the target destination, and the plan evolved as the Cart encountered new obstacles on its trip.


· 1981 – Explanation-based learning paved the way for supervised learning

Gerald Dejong pioneered explanation-based learning (EBL) in a journal article published in 1981. EBL laid the foundation of modern supervised learning, in which training examples supplement prior knowledge of the world. The program analyzes the training data and discards unneeded information to create a general rule that can be applied to future instances. For example, if the software is instructed to concentrate on the queen in chess, it will discard information about pieces that have no immediate effect on her.

· The 90s – Emergence of various machine learning applications 

In the 1990s, scientists began to apply machine learning to data mining, adaptive software, web applications, text learning, and language learning. They created computer programs that could analyze massive amounts of data and draw conclusions, or learn, from the findings. Machine learning as we know it took shape in this period, as researchers were finally able to develop software that could learn and improve on its own with little human input.

· The Millennium – The rise of adaptive programming

The new millennium saw an unprecedented boom in adaptive programming. Machine learning has gone hand in hand with adaptive solutions for a long time. These programs can identify patterns, learn from experience, and improve themselves based on the feedback they receive from the environment.

Deep learning is one example of adaptive programming: algorithms can “see” and distinguish objects in pictures and videos, which is the underlying technology behind Amazon Go shops, where customers are charged as they walk out without having to stand in line.

The history of Machine Learning: Amazon GO shops charge customers as they walk out without standing in line (Image courtesy of Amazon)

· Today – Machine learning is a valuable tool for all industries

Machine learning is one of today’s cutting-edge technologies, and it has helped us improve not just industrial and professional processes but also day-to-day life. This branch of artificial intelligence uses statistical methods to create intelligent computer systems capable of learning from the data sources available to them.

The history of machine learning: Medical diagnosis is one area that ML will change soon

Machine learning is already being utilized in various areas and sectors. Medical diagnosis, image processing, prediction, classification, learning association, and regression are just a few of its applications. Machine learning algorithms learn from previous experience or historical data and use that experience to produce outcomes.

Organizations use machine learning to gain insight into consumer trends and operational patterns, as well as to support the creation of new products. Many of today’s top businesses incorporate machine learning into their daily operations, and for many of them it has become a significant competitive differentiator. In fact, machine learning engineering is a rising discipline.

· Tomorrow – The future of Machine Learning: Chasing the quantum advantage

Our article was supposed to end here, since we have now reached the present day in the history of machine learning, but it doesn’t, because tomorrow holds more…

For example, Quantum Machine Learning (QML) is a young theoretical field investigating the interaction between quantum computing and machine learning methods. Quantum computing has recently been shown to have advantages for machine learning in several experiments. The overall objective of Quantum Machine Learning is to make things move faster by combining what we know about quantum computing with conventional machine learning. The idea of Quantum Machine Learning is derived from classical Machine Learning theory and interpreted in that light.

The application of quantum computers in the real world has advanced rapidly during the last decade, with the potential benefit becoming more apparent. One important area of research is how quantum computers may affect machine learning. It’s recently been demonstrated experimentally that quantum computers can solve problems with complex correlations between inputs that are difficult for traditional systems.

According to Google’s research, quantum computers may be more beneficial in certain applications. Quantum models generated on quantum computing machines might be far more potent for particular tasks, allowing for quicker processing and generalization on fewer data. As a result, it’s crucial to figure out when such a quantum edge can be exploited…

]]>
https://dataconomy.ru/2022/04/27/the-history-of-machine-learning/feed/ 0
Bottomless Storage and Pipeline: The Quest for a New Database Paradigm https://dataconomy.ru/2022/04/20/bottomless-storage-pipeline-new-database-paradigm/ https://dataconomy.ru/2022/04/20/bottomless-storage-pipeline-new-database-paradigm/#respond Wed, 20 Apr 2022 15:04:39 +0000 https://dataconomy.ru/?p=23284 The amount of data we create is increasing by the hour, which has resulted in organizations struggling to deal with data accumulation and analysis. Things can get chaotic pretty quickly with IoT devices, applications, manual entry, and many other sources constantly generating data with different or no structures. Anyone who has had to deal with […]]]>

The amount of data we create is increasing by the hour, which has resulted in organizations struggling to deal with data accumulation and analysis. Things can get chaotic pretty quickly with IoT devices, applications, manual entry, and many other sources constantly generating data with different or no structures.

Anyone who has had to deal with data knows that good data architecture is crucial for the correct functioning of any system. No matter how much data is being dealt with, implementing the right models, policies, and standards will directly impact how successfully information is used from the moment it is captured to the decision-making process.

Databases: The Heart and Soul of Data Architecture

When it comes to dealing with data, file systems have long been the preferred tool for storage, while databases are preferred for querying and using that data operationally. Unfortunately, legacy database models have struggled to keep up with the increasing need for real-time data ingestion, for immediate, low-latency queries that span both real-time and historical data, and for handling the demands of a growing user base expecting quick access through interactive cloud-native applications, SaaS and mobile apps, and APIs. The industry’s response has been the creation of highly specialized database engines that break this workload challenge into parts so that each part can have the speed and scale required. The unintended consequence has been an increase in application complexity, as multiple underlying database technologies must be stitched together to serve as the data system for a single application.

With applications becoming increasingly reliant on connecting to multiple databases, this overspecialization has become a significant problem, eroding the value it initially offered. Meanwhile, the shift to cloud-native architectures and the growing demand for more efficient data management are not going anywhere, meaning that if the paradigm doesn’t change, problems will only mount.

SingleStore’s Approach

The developers of database management systems are aware of the problems plaguing the industry, so most of them are looking to find new ways to move away from nuanced, specialty databases. Take SingleStore as an example. The company aims to harvest the benefits of elastic cloud infrastructure to create an integrated and scalable database that supports multiple types of applications.

With this goal in mind, SingleStore has designed multiple features to change how businesses access and use their data. These range from using Pipelines to ingest data from any source to the distribution of the storage and the execution of queries.

By developing a distributed framework for the creation, upkeep, and use of databases, SingleStore has built a new paradigm for database management.

Using Distributed Architecture to Improve Performance

SingleStore distributes its databases across many machines. By distributing the load in this way, SingleStore seeks to put performance first while facilitating online database operations and providing powerful cluster scaling. As single points of failure are removed by adding redundancy, any data in the database is accessible at all times.

Continuous Data Loading with Pipelines

SingleStore also uses its native Pipelines feature to allow users to perform real-time analytical queries and real-time data ingestion. By providing easy continuous loading, scalability, straightforward debugging, and high performance, Pipelines effectively acts as a viable alternative to ETL middleware.

The fact that popular data sources and formats are also supported makes the feature easy to integrate. These include:

  • Data sources: Apache Kafka, Amazon S3, Azure Blob, file system, Google Cloud Storage, and HDFS data sources.
  • Data formats: JSON, Avro, Parquet, and CSV data formats.

Pipelines can be easily backed up and used to restore state at any given point, which further adds to the stability of any database.
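
As a rough illustration of what working with Pipelines looks like from an application, here is a hypothetical sketch: the table, topic and host names are placeholders, and the exact CREATE PIPELINE / START PIPELINE syntax shown in the SQL string should be checked against SingleStore's documentation. A client speaking the MySQL wire protocol can issue the statements directly:

```python
# Hypothetical sketch: registering and starting a SingleStore Pipeline from
# Python over a MySQL-protocol client (pymysql). The SQL only approximates
# the documented CREATE PIPELINE / START PIPELINE syntax; host, credentials,
# topic and table names are placeholders.

import pymysql

conn = pymysql.connect(host="singlestore-host", user="app",
                       password="secret", database="analytics")

create_pipeline_sql = """
CREATE PIPELINE clicks_pipeline AS
LOAD DATA KAFKA 'kafka-broker:9092/clickstream'
INTO TABLE clicks
FORMAT JSON;
"""

with conn.cursor() as cur:
    cur.execute(create_pipeline_sql)                 # register the pipeline
    cur.execute("START PIPELINE clicks_pipeline;")   # begin continuous ingest

conn.close()
```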

Bottomless Storage for Additional Durability

In addition to dividing the load between nodes and facilitating real-time data ingestion through Pipelines, SingleStore has also developed a great way to separate storage and computing: Unlimited Storage. With Bottomless, long-term storage is moved to blob storage while the most recent data is kept in the SingleStore cluster, resulting in higher availability and flexibility.

Some of the benefits of this approach are flexibility when scaling up and down, the ability to add read replicas, low recovery time objectives, and read-only point-in-time recovery.

Distributed Infrastructure is the Future

Distributed technology has become increasingly relevant over the past years. Blockchain, distributed ledgers, distributed computing, P2P apps, and many more use cases have caught the attention of investors worldwide.

In the case of SingleStore, its approach has been attractive enough to raise over $318 million in funding from names like Khosla Ventures, Accel, Google Ventures, Dell Capital, and HPE. What started with a $5 million Series A round back in 2013 grew to an $80 million Series F round in 2021.

This success has also seen the platform’s user base grow, and the industry has recognized its contributions, including listings in Deloitte’s “2021 Technology Fast 500™” rankings, the San Francisco Business Times’ “Fast 100”, and Inc.’s “5000” for 2020.

A few months ago, Gartner also added SingleStore to its Magic Quadrant for Cloud Database Management Systems, one of the most trusted reports in the industry.

]]>
https://dataconomy.ru/2022/04/20/bottomless-storage-pipeline-new-database-paradigm/feed/ 0
Storage for video surveillance: keep it simple https://dataconomy.ru/2021/08/31/storage-video-surveillance-keep-it-simple/ https://dataconomy.ru/2021/08/31/storage-video-surveillance-keep-it-simple/#respond Tue, 31 Aug 2021 12:30:59 +0000 https://dataconomy.ru/?p=22270 This year a significant event will take place: somewhere in the world, the billionth CCTV camera will be installed. This means that a camera already monitors every seventh person on the planet. And in some cities, more than a million cameras are already in use, making the ratio even more impressive. That’s a great deal […]]]>

This year a significant event will take place: somewhere in the world, the billionth CCTV camera will be installed. This means that a camera already monitors every seventh person on the planet. And in some cities, more than a million cameras are already in use, making the ratio even more impressive.

That’s a great deal of surveillance. But cameras are used for more than just security. They also help businesses ensure quality control of processes, improve logistics, get better product placement, recognize privileged customers the moment they enter the sales area, and so on.

Storage for video surveillance: keep it simple

RAIDIX sees the usage of video analytics tools for enterprise tasks as an appealing challenge, so they have developed a line of solutions based on:

  • a scalable video archive with an architecture that has no single point of failure and the most reliable RAID in the industry;
  • high-performance storage system, which will significantly increase the speed of training models;
  • high-performance solutions for edge infrastructures;
  • mini-hyperconverged solution.

RAIDIX offers several types of solutions that can be used in high-performance infrastructures:

  • centralized solution based on high-performance RAIDIX ERA engine, NVMe drives and high-performance network from NVIDIA:
AFA based on AIC HA202-PV platform
AFA based on Supermicro server platform and Western Digital EBOF 
  • a centralized solution for creating video archives that provide the highest access speed and availability of large amounts of data:
A basic scheme of a video archive 
Data Storage System based on Supermicro server platform and Western Digital EBOF 
  • RAIDIX ERA-based solution for edge infrastructures:
  • mini-hyperconverged platform for smaller projects:

Below is a closer look at implementing a video archive in modern installations.

Industry Challenges and Storage Requirements

Video surveillance projects face new challenges at the data storage level. These include not only greater requirements for bandwidth and storage capacity but also changes in the type of load placed on the storage system.

Now, most of the workload falls on these tasks and processes:

  • continuous random write operations from multiple cameras and video servers;
  • unpredictable random read operations of the video archive on demand;
  • high transactional load on databases;
  • high-speed work with memory for analytics.

In addition to managing the variety and intensity of these storage workloads, scalability is critical to accommodating new cameras and continually increasing resolutions. Also, to meet the growing needs of video surveillance, companies need high-performance, reliable, and efficient storage systems.

Solution: NAS and…?

Large video surveillance projects go well beyond network video recorders and storage on video surveillance servers.

Modern VSS requires an enterprise-grade infrastructure with separate servers and storage units. The layered approach allows for increased processing power, faster I/O processing, and increased throughput and capacity.

With these requirements in mind, enterprise storage systems are dominated by two architectures:

  • NAS: stores data as files and presents these files to the application as a network folder;
  • SAN: looks like local storage, allowing the operating system to manage the disk.

In the context of video surveillance applications, these two approaches are polar opposites.

Recently, SAN has become the preferred option for enterprise VSS. NAS technology does a good job for many tasks, but multi-camera recording, database, and analytics workloads demand a level of performance that calls for a direct-attached or SAN approach. IHS forecasts show that the SAN market will grow by more than 15% in 2020-2022, while the NAS segment’s annual growth will drop from 5% to about 2%.

For this reason, video surveillance software vendors recommend local or SAN-attached storage.

Also, many video surveillance projects operate in virtual environments. In these cases, each virtual video surveillance server requires high-performance storage not only for its video content, but also for the operating system, applications, and databases.

Make it VSS (Viable Simple Storage) 

Both SAN and NAS are easy to use, and the deployment steps are almost the same, since both architectures may rely on Ethernet-based connectivity (although SANs can also use other media such as FC) so that files and directories can be accessed from multiple systems. Such solutions must use file locking to prevent multiple systems from modifying files at the same time.

Since many video surveillance systems do not require common video sharing, all this file locking and the complexity of the shared file system is unnecessary overhead that limits performance and adds complexity to maintenance and protection.

Deduplication and compression, also offered by many NAS and SAN systems, are unnecessary for video surveillance solutions. Choosing a solution with these features incurs additional costs for unused technologies. These useless features built into the software negatively impact overall performance and require maintenance to ensure safety and reliability.

Storing data at different levels can be useful when deploying video surveillance. However, video surveillance software already knows how to manage this, as it can create separate storage for databases, real-time recording, and archives. As long as the data is managed by video surveillance software, there is no need for storage in the storage system to move data between tiers dynamically. Consequently, data tiering or automated management is not required as a storage function and also increases risks and complexity.

Why SAN is effective

Most scalable file systems require multiple servers to function. Solutions with multiple servers, in turn, require an internal network, which can create the following problems:

  • Each write operation creates a series of data transfers over the internal network, which limits performance;
  • peer-to-peer connections create more potential points of failure, which can make it harder to increase storage or replace equipment;
  • while achieving the same redundancy levels as the SAN, scalable file systems provide less bandwidth.

SAN solutions for VSS are also offered by RAIDIX. These solutions are based on software RAID, capable of performing checksum calculations faster than any similar solution in the industry. Also, RAIDIX supports various SAN protocols (iSCSI, FC, iSER, SRP), which help to achieve a number of goals:

  • providing high bandwidth (up to 22 GB/s) to work with thousands of high-resolution cameras that can be connected through dozens of video servers;
  • cost-effective maintenance with an increase in the number of cameras and in archive depth: due to the use of proprietary RAID-array technologies, fewer disks are required to obtain the required storage volume and performance;
  • vertical scalability up to 11PB per storage system due to the ability to work with large RAID groups of up to 64 disks and provide failover for two or more disks (when using RAID 7.3 / N + M), as well as combining these groups into a single volume;
  • high reliability of data storage when using RAID 7.3 or RAID N + M, the most fault-tolerant RAID-arrays on the market, which makes possible the use of large disks (up to 18-20TB) without compromising data safety. With an increase in the volume of disks and their number in a RAID array, the likelihood of data loss increases sharply, as the reliability of the disks decreases as well. So, the probability of data loss for RAID6 of 24 18TB disks after one year in operation is 1%, while for RAID 7.3 it is only 0.001%;
  • stability of operation during sudden increases in workload due to sufficient performance headroom, even in situations where drive failure coincides with peaks of intensive work of the video surveillance system. This is achieved thanks to unique technologies of proactive and partial reconstruction;
  • the high performance of RAIDIX storage system does not limit the capabilities of analytical software for video surveillance. Face recognition, motion capture, and other video analytics functions will work without downtime and with minimal latency;
  • the possibility of using the obtained video surveillance data simultaneously not only in security tasks, but also in business tasks for carrying out various analytics. It does not require additional copying operations to analytical systems, while the use of smart prioritization due to QoSmic technology allows users to avoid the influence of additional storage tasks on the main recording function;
  • building an enterprise-level architecture without a single point of failure: RAIDIX 5.X supports dual-controller operation with possible replication to remote systems.

Where to start choosing an archive storage system?

When calculating and selecting an archived data storage system, the following parameters should be considered:

  • type of cameras and their number;
  • archive depth in days;
  • additional retention requirements (if any);
  • the intensity of movement in the frame, its distribution over the time of day or depending on events;
  • type of network infrastructure, its need for updates;
  • how the video analytics software is deployed;
  • whether it is required to use the resources of cloud infrastructures;
  • when and what kind of upgrade is expected (type and number of cameras, list of services, depth of the archive, etc.)

For a basic calculation, one can use the calculators available at specialized software vendors’ websites. For a more accurate calculation in complex projects, the participation of professionals will be required.
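
For a sense of what such a calculation involves, here is a simple back-of-the-envelope sketch; every number in it (camera count, bitrate, overhead factor) is an assumption for illustration, and real projects must also account for motion-dependent bitrates, retention rules and failure headroom:

```python
# Back-of-the-envelope sketch for sizing a video archive. All figures are
# illustrative assumptions; a real project would refine them per camera
# profile and add headroom for rebuilds, metadata and peak loads.

def archive_size_tb(cameras, bitrate_mbps, hours_per_day, archive_days,
                    overhead=1.2):
    """Estimated raw capacity in decimal terabytes."""
    seconds = hours_per_day * 3600 * archive_days
    total_bits = cameras * bitrate_mbps * 1_000_000 * seconds
    terabytes = total_bits / 8 / 1e12
    return terabytes * overhead   # margin for RAID, rebuilds and metadata

# Example: 500 cameras at 4 Mbps, recording 24 hours a day, 30-day archive.
print(f"estimated archive size: {archive_size_tb(500, 4, 24, 30):.0f} TB")
```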

In addition, there are two important points to consider when calculating.

Firstly, calculating desirable characteristics of data storage systems should be carried out with the worst-case scenario in mind: the maximum load in case of failure of storage system components, controllers, and drives. Unfortunately, this is what usually happens in real life: with an increase in the load, physical components begin to fail as their capabilities reach the limit.

Secondly, drive capacities keep growing while drive performance stays roughly the same, and classic RAID simply cannot keep up. We need technologies that will ensure the availability of large data volumes over the long term. However, with the mass adoption of dual-actuator drives, this may soon change.

Thus, the elements of a modern video archive are:

  • large volume drives (16-18TB);
  • two or more controllers;
  • high-performance access interfaces (FC at 16 Gbps or faster, Ethernet at 10 Gbps or faster);
  • controller software that allows easy volume scaling without service downtime, can survive the failure of multiple drives and of at least one storage controller without losing performance, and is adapted to continuous recording.

Conclusion

The demand for video surveillance projects is steadily growing, and with it the demand for solutions that create fault-tolerant storage systems. The two main approaches to media storage targeted at the enterprise segment are NAS and SAN. The latter configuration appears to be the better fit for video surveillance projects because of its higher performance, its ability to function in different environments, and its ability to work with a large number of video servers. For customers looking for high performance and fault tolerance, RAIDIX provides advanced SAN storage solutions based on fast software RAID.

In general, modern data storage systems provide a great number of options, and the user’s task is to determine what is important so as to avoid overpaying and placing unnecessary load on the system. For example, video surveillance does not actually require storage tiering or automatic management as a storage function. At the same time, this does not mean that choosing a data storage system is a piece of cake: there are about a dozen software- and hardware-related factors to pay attention to. Also, when calculating performance and fault-tolerance targets for a future storage system, one should always plan for the worst possible scenario, which is the maximum load coinciding with the failure of storage system components.

]]>
https://dataconomy.ru/2021/08/31/storage-video-surveillance-keep-it-simple/feed/ 0
Cloudian Now Tackling Storage-Specific Use Cases, Starting with Big Data & Hadoop https://dataconomy.ru/2015/01/19/cloudian-now-tackling-storage-specific-use-cases-starting-with-big-data-hadoop/ https://dataconomy.ru/2015/01/19/cloudian-now-tackling-storage-specific-use-cases-starting-with-big-data-hadoop/#respond Mon, 19 Jan 2015 15:28:37 +0000 https://dataconomy.ru/?p=11495 Cloudian, the hybrid cloud storage provider, has rolled out its latest offering – the Cloudian HyperStore 5.1 software, last Thursday. An enhanced Amazon S3-compliant, the HyperStore 5.1 is a plug-and-play hybrid cloud software accommodates full Apache Hadoop integration. “To store data on a smart system like HyperStore, and run Hadoop on top of it, offers […]]]>

Cloudian, the hybrid cloud storage provider, rolled out its latest offering, the Cloudian HyperStore 5.1 software, last Thursday. An enhanced Amazon S3-compliant offering, HyperStore 5.1 is plug-and-play hybrid cloud software that accommodates full Apache Hadoop integration.

“To store data on a smart system like HyperStore, and run Hadoop on top of it, offers huge business advantages to enterprises in terms of scalability, efficiency and cost savings,” says Paul Turner, chief marketing officer at Cloudian.

“From the financial industry to medical research, there is no shortage of markets that will benefit from turning big data into smart data in pursuit of realizing market- and revenue-shifting insights,” he added.

The new software allows customer enterprises to run Hadoop analytics directly on HyperStore software and appliances. This in-place analytics requires no offloading of data to other systems for Hadoop analysis, providing enterprises with ‘meaningful’ business intelligence from their data ‘quickly, efficiently and economically,’ the company’s news release explains.

Meanwhile, Cloudian has completed testing for Hortonworks’ Certification to join the Hortonworks Technology Partner Program, thus allowing HyperStore customers to have access to Hortonworks Data Platform (HDP).

Commenting on the development, VP Strategic Marketing at Hortonworks, John Kreisa said: “HyperStore 5.1 helps further Hadoop and its YARN architecture for the enterprise, offering the open source technology community robust, best-in-class data storage with built-in, real time data analytics.”

Salient features of HyperStore 5.1 include:

  • Federated geo-replication, which enables real-time access to your organization’s metadata across multiple global sites;
  • An improved management console and user interface for unified monitoring, usable visibility and cost governance, archiving, search and discovery of on-premises and cloud-based data;
  • New security features that support use of the end user’s credentials in the cloud;
  • OpenStack ‘Icehouse’ support.

HyperStore 5.1 is available now as a free upgrade to existing HyperStore customers. For all others, it is available for immediate purchase from Cloudian and through the Cloudian partner network.

Read more here.


(Image credit: Cloudian)

]]>
https://dataconomy.ru/2015/01/19/cloudian-now-tackling-storage-specific-use-cases-starting-with-big-data-hadoop/feed/ 0
Front Porch Digital’s Big Data Technologies Now Part of Oracle’s Comprehensive Storage Portfolio https://dataconomy.ru/2014/09/16/front-porch-digitals-big-data-technologies-now-part-of-oracles-comprehensive-storage-portfolio/ https://dataconomy.ru/2014/09/16/front-porch-digitals-big-data-technologies-now-part-of-oracles-comprehensive-storage-portfolio/#respond Tue, 16 Sep 2014 08:09:31 +0000 https://dataconomy.ru/?p=9190 Digital asset management innovator Front Porch Digital have inked a deal to be acquired by Oracle. The purchase will allow both the companies to help clients efficiently manage the growing complexities involved in migration, integration, storage, and delivery of rich media content, reports a press release announcing the acquisition. John Fowler, the Executive Vice President […]]]>

Digital asset management innovator Front Porch Digital has inked a deal to be acquired by Oracle.

The purchase will allow both the companies to help clients efficiently manage the growing complexities involved in migration, integration, storage, and delivery of rich media content, reports a press release announcing the acquisition.

John Fowler, the Executive Vice President of Oracle Systems said in this regard, “Organizations need a modern, integrated content storage management solution to manage and monetize their valuable rich media assets. We will continue to build on Front Porch Digital’s success and unique capabilities, which complement Oracle’s existing high performance and scalable engineered storage solutions.”

Front Porch Digital assists upwards of 550 organizations globally, including A&E Television, BBC, Discovery Communications, the U.S. Library of Congress and NASCAR, in ensuring the availability and accessibility of their valuable content. In total, Front Porch Digital manages over 750 petabytes of digital content, reports the statement.

“Front Porch Digital has developed industry leading solutions that help companies manage large-scale digital content,” added Mike Knaisch, the CEO of Front Porch Digital. “We are thrilled to be joining Oracle to continue our long-standing partnership. This combination will enable us to better serve and support our customers at a global scale.”

The terms of the agreement remain undisclosed.

Read more here.


(Image credit: Oracle)

]]>
https://dataconomy.ru/2014/09/16/front-porch-digitals-big-data-technologies-now-part-of-oracles-comprehensive-storage-portfolio/feed/ 0
Stealthy DataGravity’s Discovery Deploys Analytics to Make Storage Intelligent https://dataconomy.ru/2014/08/20/stealthy-datagravitys-discovery-deploys-analytics-to-make-storage-intelligent/ https://dataconomy.ru/2014/08/20/stealthy-datagravitys-discovery-deploys-analytics-to-make-storage-intelligent/#respond Wed, 20 Aug 2014 08:13:18 +0000 https://dataconomy.ru/?p=8528 Up until this point, startup DataGravity have been rather secretive about the data-aware solutions they’re developing. But now they’ve released their Discovery Series, following a successful beta period. The Discovery Series is initially offered in two models, the DG2200 and DG2400, with 48TB and 96TB capacity levels, which essentially applies integrated search and analytics to […]]]>

Up until this point, startup DataGravity has been rather secretive about the data-aware solutions it is developing. Now, following a successful beta period, it has released its Discovery Series. Initially offered in two models, the DG2200 and the DG2400, with 48TB and 96TB capacity levels, the series essentially applies integrated search and analytics to storage so that it is no longer a black box.

IDC program vice president of storage Laura Dubois said the unstructured data dilemma is growing. “IDC has been predicting technology would catch up to provide an answer to the market demand,” she said. “The DataGravity approach is transformational in an industry where innovation has been mostly incremental. DataGravity data-aware storage can tell you about the data it’s holding, making the embedded value of stored information accessible to customers who cannot otherwise support the cost and complexity of solutions available today.”

More than 80 percent of the data being produced today is unstructured, according to a recent Gartner report. DataGravity explains that companies are struggling to maintain their rapidly growing stores of human-generated data. What’s more, enterprises are spending vast amounts of money on storage infrastructure, layered management applications and siloed processes that only introduce greater complexities without offering any real-time insights or benefits.

DataGravity intends to address this gap with a unified storage platform that offers insights with the same richness of intelligence, regardless of whether the data is block or file. It supports NFS, CIFS/SMB and iSCSI LUNs, with the additional capability to manage virtual machines natively.

Long-time storage industry veteran and co-founder of EqualLogic Paula Long said, “We’ve integrated data analytics into storage — as data is ingested we capture who’s reading and writing it using Active Directory or LDAP. We capture who’s interacting with the data on the front end, we provide audit and activity trail, and on the backend we index over 400 data types.”

It offers a fully visual view of the data and ways to dig into it; information that was typically hidden inside storage containers, or that required third-party tools to view, is now easily available.

The Discovery Series array includes features such as showing the top users and dormant data on a particular share, and a facet filter that lets the user explore different facets of the data, such as readers, owners and file types, zooming in or out of the data to meet particular needs. Each file put into the storage array is opened and indexed automatically.

It also uses industry standard algorithms to find personally identifiable information (PII) such as social security and credit card numbers. And as with any information in the DataGravity system, you can click any PII and learn more about it and then filter it by relevant facets.
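
For a flavour of how pattern-based PII detection works in general (a hedged sketch, not DataGravity's actual implementation): a regular expression can catch US-SSN-shaped strings, while candidate card numbers are confirmed with the standard Luhn checksum.

```python
# Hedged sketch of generic pattern-based PII detection, not DataGravity's
# implementation: SSN-shaped strings via regex, card numbers via the Luhn check.

import re

SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(number: str) -> bool:
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:          # double every second digit from the right
            d = d * 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_pii(text: str):
    hits = [("ssn", m.group()) for m in SSN_RE.finditer(text)]
    hits += [("card", m.group()) for m in CARD_RE.finditer(text)
             if luhn_ok(m.group())]
    return hits

print(find_pii("Contact 123-45-6789, card 4111 1111 1111 1111"))
```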

“We believe that storage, provisioning, deduplication and in-line compression have become standard features (in storage),” Long, the startup’s CEO, said. “The features we’re talking about will be the next thing people ask for in their RFPs. Storage can’t be just a container for much longer.”

Read more here.


(Image credit: Datagravity)

]]>
https://dataconomy.ru/2014/08/20/stealthy-datagravitys-discovery-deploys-analytics-to-make-storage-intelligent/feed/ 0
Cloud-Integrated Storage Startup Nasuni Picks Up $10m to Enhance Outreach, Sales and Marketing https://dataconomy.ru/2014/08/18/cloud-integrated-storage-startup-nasuni-picks-up-10m-to-enhance-outreach-sales-and-marketing/ https://dataconomy.ru/2014/08/18/cloud-integrated-storage-startup-nasuni-picks-up-10m-to-enhance-outreach-sales-and-marketing/#respond Mon, 18 Aug 2014 08:28:51 +0000 https://dataconomy.ru/?p=8463 Nasuni, a cloud-integrated storage provider for the global enterprise that combines on-premise hardware with cloud storage, has struck a $10 million round of venture capital funding. Early investors like Flybridge Capital Partners, North Bridge Venture Partners and Sigma Partners, were joined by newer investors as they participated in this extension of the company’s C round. […]]]>

Nasuni, a cloud-integrated storage provider for the global enterprise that combines on-premise hardware with cloud storage, has closed a $10 million round of venture capital funding. Early investors, including Flybridge Capital Partners, North Bridge Venture Partners and Sigma Partners, were joined by newer investors in this extension of the company’s Series C round.

“Simply put, we wanted a bigger share of Nasuni,” said Paul Flanagan, managing director at Sigma Partners. “With their disruptive technology and approach to delivering enterprise storage as a service, Nasuni is revolutionizing the way data storage is deployed. Clearly, we’re excited about the company and have been super impressed with its growth. The opportunity here is enormous, and Nasuni is perfectly positioned to take full advantage of IT’s shift to the cloud.”

Andres Rodriguez, founder and CEO explains, “The Nasuni Service liberates data from the limitations and high cost of traditional storage silos. With this new financing, we will expand our outreach and accelerate innovation and market adoption.”

Nasuni’s unified storage intends to take on firmly established vendors like EMC and NetApp with its patented UniFS Global File System, which gives users fast access to a global file share no matter where they are located.

The Nasuni Service logged a 232-percent increase in bookings in the second quarter of 2014 and 181-percent sales growth. Expected to be a few years away from an IPO, this Boston startup will use the new funds to scale engineering, sales and marketing efforts.

Read more here.


(Image credit: Nasuni)

]]>
https://dataconomy.ru/2014/08/18/cloud-integrated-storage-startup-nasuni-picks-up-10m-to-enhance-outreach-sales-and-marketing/feed/ 0