

New concepts that result in change are typically disruptive and costly when they enter the information infrastructure of our lives and businesses. The latest concept of ‘utility computing’ seems to not only be one of the ‘hottest’ buzzwords on the storage conference circuit, but also a concept that will force considerable change. However, even though it is beginning to gain acceptance — and some experts believe that storage is the next logical step — many experts believe that utility computing in the storage sector is at least a few years away from mainstream use.

“Over the past few years, physical storage has become almost as much of a commodity as the water that utilities carry,” says Tom Fredrick, senior vice president at Precise SRM. Fredrick says technologies that allow the dynamic provisioning of storage help to complete the delivery of storage as a utility. “New technologies that provide policy-based management of storage content add a layer of intelligence to the storage — making it more seamless to the user on the other end of the wall jack,” he continues.

On the other hand, Rik Mussman, vice president of technology with Nexsan, believes that utility computing in the storage arena is still years away from mainstream use, although he does point out that utility computing in other areas, such as the remote storage of email and Web hosting, is quite common today and widely accepted.

Diamond Lauffin, senior executive vice president of Nexsan, says that the real question is not whether we have the technology available or the capability to deliver the concept of utility computing in the storage arena; the real question is one of social acceptance. “The United States has already embraced and turned itself into a service-based organizational model, long ago foregoing its position of being a manufacturing dynasty,” he says.

“We see well-accepted examples in the decline of physical home answering machines. Remote voice mail services have supplanted home-based answering machines and allow ease of use and greater mobility. Cell phones have driven this even further. When have you ever picked up your VM on your cell phone from a physical answering device? Have you questioned whether this remote storage would work?”


‘Charge Back’ a Key to Utility Trend

One of the key parts of the utility computing concept that many think will drive the trend is the ability to do “charge back,” which allows IT departments to divide the costs of providing servers and storage among a firm’s various business units. The big question is, will the concept of “charging back” accelerate the use of utility computing?

According to Fredrick, charge backs are already a part of IT today; however, he says, many companies just spread the cost across all the departments, with everybody paying into the pool. “This ‘one size fits all’ approach does not work well and sometimes fails to adequately support a particular part of the business,” says Fredrick.

“It will become the business line managers, in conjunction with IT professionals, that will drive new technology adoptions and innovations to continually lower costs,” Fredrick continues. “Charge back systems reward innovation and improve the return on technology assets.”
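The contrast Fredrick draws between pooled funding and usage-based charge back can be sketched in a few lines. In this hypothetical Python illustration (department names and dollar figures are invented), an evenly split "one size fits all" bill is compared with one divided in proportion to actual storage consumed:

```python
# Hypothetical illustration: pooled ("one size fits all") vs. usage-based
# charge back for a shared storage bill. All departments and figures are
# invented for the sketch.

def pooled_chargeback(total_cost, departments):
    """Split the bill evenly, regardless of consumption."""
    share = total_cost / len(departments)
    return {dept: round(share, 2) for dept in departments}

def usage_chargeback(total_cost, usage_gb):
    """Split the bill in proportion to each department's storage use."""
    total_gb = sum(usage_gb.values())
    return {dept: round(total_cost * gb / total_gb, 2)
            for dept, gb in usage_gb.items()}

usage = {"engineering": 600, "marketing": 150, "finance": 250}  # GB consumed
print(pooled_chargeback(30_000, usage))   # every department pays 10,000
print(usage_chargeback(30_000, usage))    # engineering pays 18,000
```

Under the pooled model, marketing subsidizes engineering; under usage-based charge back, each business unit sees (and can act on) its own storage cost.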

However, with that said, Lauffin disagrees when it comes to small and medium-sized businesses (SMBs) and says that charge back will not have any bearing on the progress. “Charge back is already an established protocol at most large companies and has been for several years,” he says. “With smaller companies, the concept of charge back is irrelevant in that they typically have no capacity for it,” he elaborates. Mussman agrees and says that the concept of utility computing will probably never be embraced by SMBs because they do not have the financial resources necessary to support it.


Managing Storage as a Service

Still, is the concept of ‘managing storage’ as a service really a part of the fabric of storage? Fredrick points out that there are two areas of managing storage: The management of the physical storage, which is quickly becoming the enabler for utility computing, and storage content management.

Fredrick contends the physical management of storage will become part of the fabric of storage by introducing dynamic LUN management and alternate-path data routing. In addition, he feels that these services are part of the pipes of the utility computing model. “As for content management, this will become a service offered to business line managers to better manage their IT resources,” he says.

Many experts believe that storage will eventually become a service or utility with data ending up in remote vaults, managed at a central point, by a central group. “We will see a greater number of central vaults of data storage,” says Lauffin. However, he again brings up the issue of social acceptance. “On the way to those central vaults, there will be different generational perceptions standing in the way. Overcoming generational differences has traditionally been a 5-8 year cycle within our industry,” he adds.

With all this talk about utility computing, another question arises and that is: Is the concept of utility computing a business style issue or a technical one? Although many people believe that it is more of a business style issue, others do not see it that way. “IT managers have always looked for ways to demonstrate the value they bring to an organization,” says Fredrick. “Getting the businesses to buy into a differentiated delivery of IT services based on what they are willing to pay is a new concept that needs to be accepted.” Mussman disagrees and says that it will be easier to get the business side to buy into the concept of utility computing because the IT side will be concerned with losing their jobs if and when companies embrace storage as part of utility computing.


What’s Driving the Utility Concept and What’s in Store for It?

Although the concept of utility computing has been a major issue on the conference circuit, what exactly is driving this concept, and is it an “either/or” proposition? Fredrick says that today’s IT environments are so complex that when a problem occurs the finger pointing begins. “It’s the storage, it’s the network, or it’s the application, and nothing gets fixed,” he says. He says that by approaching utility computing as a “service offering,” all the disparate departments will soon realize that they must work together to deliver the service.

As far as utility computing being an “either/or” proposition, Lauffin does not believe that it will fit into that category. “In the transition to this utility paradigm we will see the continued trend for responsible companies to co-locate their data onsite and offsite for greater security and confidence,” he says. “This may be the pervading model for some time, particularly with large businesses.” Mussman also believes that most companies will start with selected services that meet particular needs, and some customers will even double up as providers, offering resources where they have special skills or capacity.


See All Articles by Columnist Leslie Wood


Internet Of Things In The Retail Sector

The retail industry is under immense pressure to keep up with the ever-changing landscape of technology. To stay ahead of the curve, retailers must be able to adopt new technologies quickly and efficiently. The Internet of Things (IoT) is a technology that is rapidly changing the retail landscape. IoT refers to the network of physical objects connected to the internet that can collect and exchange data. IoT devices can include retail store security cameras, in-store beacons, and customers’ smartphones.

By 2025, it is estimated that there will be 75.44 billion IoT devices worldwide. The retail industry is using IoT devices to collect data about customers, inventory, store operations, and more. This data can be used to improve the customer experience, increase sales, and reduce costs. In this blog post, we will explore ways IoT is used in the retail industry and its potential implications for the future.

The Retail Industry and the IoT

The retail industry is under pressure as consumers shift their spending to experiences over things. To stay relevant, retailers must find ways to enhance the customer experience in their stores. The Internet of Things (IoT) allows retailers to do this by providing customers with a seamless and personalised shopping experience.

One can use the IoT in various ways to improve the customer experience in a store. For example, one can use IoT-enabled devices to track inventory levels and provide real-time information to store associates, so they can always help customers find what they need. Additionally, one can use IoT devices to collect data about customer behaviour in the store, which one can use to improve store layout and design and optimize promotions and marketing campaigns.
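As a toy sketch of the second idea above (using behavior data to inform layout and promotions), the hypothetical snippet below aggregates in-store beacon dwell-time events by store zone; the event shape and zone names are invented for illustration:

```python
# Hypothetical sketch: turning raw in-store beacon events into per-zone
# dwell-time totals that a layout or promotions team could act on.
# Event fields and zone names are invented for illustration.
from collections import defaultdict

def dwell_by_zone(events):
    """Sum dwell seconds per store zone from (zone, seconds) beacon events."""
    totals = defaultdict(int)
    for zone, seconds in events:
        totals[zone] += seconds
    return dict(totals)

events = [("entrance", 30), ("electronics", 240),
          ("electronics", 180), ("checkout", 90)]
print(dwell_by_zone(events))  # electronics dominates at 420 seconds
```

A real deployment would stream far richer events, but even this minimal aggregation shows how raw sensor data becomes a layout decision: the zone with the most dwell time is a candidate for premium product placement.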

Ultimately, IoT in retail creates a more efficient and enjoyable customer shopping experience. By leveraging data and technology, retailers can stay ahead of the curve and keep their customers returning for more.

Benefits of the IoT for Retail Businesses

The retail industry has always been quick to adopt new technologies that can give them a competitive edge. The Internet of Things is the latest technology retailers are adopting to improve their operations. Here are some of the benefits that the IoT can provide for retail businesses −

Improved customer service − With the IoT, retailers can collect data about their customers’ preferences and use that information to provide them with better customer service. For example, suppose a customer buys a product from a retailer’s online store. In that case, the retailer can use the IoT to track the customer’s shipping information and send them updates about the status of their order.

Reduced costs − The IoT can help retailers reduce costs in several ways. For example, by using sensors to monitor inventory levels, retailers can avoid overstocking products that may not sell well. In addition, one can also use the IoT to automate tasks such as pricing and reordering products, which can lead to significant cost savings.

Increased sales − By using the IoT to personalize the shopping experience for each customer and provide them with relevant information about products they might be interested in, retailers can increase their sales. For example, suppose a customer is looking at a product on a retailer’s website. In that case, the retailer can use the IoT to show the customer other products that are similar or related to what they are looking for.

Improved supply chain management − The latest IoT technologies can help retailers automate their supply chain and make it smarter. With state-of-the-art IoT applications, things like inventory management, distributor commissions, and sales incentives can be easily managed at one’s fingertips.
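The automated-reordering idea from the "Reduced costs" point above can be sketched as a simple reorder-point check against live sensor counts. Product names and thresholds here are hypothetical:

```python
# A minimal sketch of automated reordering: compare sensed stock counts
# against per-product reorder points. Products and thresholds are invented.

REORDER_POINTS = {"umbrella": 20, "sunscreen": 15, "batteries": 50}

def reorder_list(stock_counts, reorder_points=REORDER_POINTS):
    """Return products whose sensed stock is at or below its reorder point."""
    return sorted(p for p, count in stock_counts.items()
                  if count <= reorder_points.get(p, 0))

sensed = {"umbrella": 8, "sunscreen": 40, "batteries": 50}
print(reorder_list(sensed))  # ['batteries', 'umbrella']
```

In practice the output would feed a purchasing system rather than a print statement, but the cost-saving logic is the same: shelf sensors replace manual stock checks, and orders fire only when thresholds are crossed.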

The Challenges of the IoT for Retail Businesses

The Internet of Things (IoT) is transforming the retail industry. It offers new opportunities for retailers to connect with their customers and improve operational efficiency. However, IoT also presents challenges; let’s discuss them further −

One of the main challenges of IoT is data security. Retailers collect data from their customers, including personal information and purchase history. This data is stored on servers that only authorized personnel can access. However, if these servers are hacked, the data can be leaked or used for identity theft.

Another challenge of IoT is interoperability. Many different types of devices and systems need to work together for the IoT to function properly. However, not all devices and systems are compatible with each other. This can make it difficult for retailers to implement IoT solutions.

Finally, another challenge of IoT is scalability. As the number of devices and sensors increases, the amount of data stored does too. It can strain resources, both in terms of hardware and software.

Despite these challenges, the potential benefits of IoT for retail businesses are too great to ignore. By addressing these challenges head-on, retailers can reap the rewards of this transformative technology.

Conclusion

The Internet of Things (IoT) is a rapidly growing technology that significantly impacts the retail industry. With IoT devices, retailers can collect data about their customers and shopping habits, which one can use to improve the customer experience. In addition, one can also use IoT devices to track inventory levels and ensure that products are available when customers need them. The benefits of IoT in the retail industry are vast, and it is clear that this technology is here to stay.

Hipaa Compliance: Storage In The Cloud

The Health Insurance Portability and Accountability Act (HIPAA) sets standards for protecting confidential patient information. Organizations dealing with protected health information (PHI) must implement and follow physical, network, and procedural security measures in order to be HIPAA compliant. All covered entities (healthcare providers, health plans, and clearinghouses) and their business associates must comply with HIPAA, and subcontractors and other relevant business partners are subject to the same requirements.

The Importance of HIPAA Compliance

HIPAA compliance is more important than ever as healthcare providers and other organizations dealing with PHI move to computerized processes such as electronic care management and self-service software. All of these technologies promote efficiency and mobility, but they also dramatically exacerbate health data security risks.

The HIPAA Security Rule allows affected facilities to use cutting-edge technology to improve the effectiveness and quality of patient care while protecting individual privacy. The Security Rule is flexible by design, allowing covered organizations to adopt procedures, methods, and tools appropriate to their size, organizational structure, and the security risks to electronic protected health information (ePHI).

Protected Health Information

The demographic information that can be used to identify a patient or customer of a HIPAA-covered business is called “Protected Health Information” (PHI). Common PHI includes, but is not limited to, name, address, phone number, social security number, medical information, financial information, and photographs.
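As a toy illustration of why PHI handling matters, the sketch below scans free text for two of the identifiers listed above (social security and phone numbers) and redacts them before a record leaves a covered system. Real PHI detection is far broader than a pair of regular expressions; this is only a sketch:

```python
# Illustrative only: a toy scan for two common PHI identifiers (US SSNs
# and phone numbers) in free text. Real PHI detection covers many more
# identifiers and formats than these two regexes.
import re

PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
}

def redact_phi(text):
    """Replace matched identifiers with a [REDACTED:<kind>] placeholder."""
    for kind, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{kind}]", text)
    return text

note = "Patient SSN 123-45-6789, callback 555-867-5309."
print(redact_phi(note))
```

Even this minimal filter shows the shape of the problem: PHI hides in unstructured text, so any system that stores or shares records needs a redaction or access-control step in the pipeline.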

HIPAA-Compliant Cloud Storage Providers

Google Cloud Drive and G Suite

In 2013, Google began signing its BAA for “G Suite,” which includes Gmail, Google Drive, Calendar, and Vault. Thanks to this ingenious move, Google Cloud Drive is now HIPAA compliant, which industry experts praise.

G Suite “includes all the controls necessary to make the service HIPAA compliant and, provided accounts are properly configured and used in accordance with industry-standard security practices, can be used by HIPAA-covered organizations to share PHI,” according to the HIPAA Journal.

Amazon Web Services (AWS)

Amazon will sign a BAA covering Amazon S3 and provides simple instructions for setting up HIPAA-compliant cloud storage using Amazon Web Services (AWS). AWS serves as the CSP for some of the biggest brands in healthcare and life sciences. According to the AWS compliance page, enhanced security requirements such as FedRAMP and NIST 800-53 map onto the HIPAA security rules and can be used to align a HIPAA risk management program with the HIPAA rules applicable to your operating model.
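As one concrete (and entirely hypothetical) building block of such a setup, the snippet below generates an S3 bucket policy that denies uploads lacking KMS server-side encryption, a commonly recommended control for buckets holding sensitive data. The bucket name is invented, and nothing here substitutes for the AWS BAA or AWS's own compliance documentation:

```python
# Sketch (not AWS guidance): build a bucket policy that denies any
# s3:PutObject request that does not specify KMS server-side encryption.
# The bucket name is hypothetical.
import json

def deny_unencrypted_uploads(bucket):
    """Return a bucket policy dict that rejects unencrypted object uploads."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyUnencryptedPuts",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
            "Condition": {
                "StringNotEquals": {
                    "s3:x-amz-server-side-encryption": "aws:kms"
                }
            },
        }],
    }

print(json.dumps(deny_unencrypted_uploads("phi-records-example"), indent=2))
```

A policy like this enforces encryption at rest mechanically rather than by convention, which is the kind of technical safeguard the HIPAA Security Rule expects covered entities to document.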

Atlantic Net

Atlantic.Net Hosting is fully audited and HITECH and HIPAA compliant. The company is widely recognized for its wide range of managed security services and superior cloud platform. They are designed to help businesses meet all their cloud storage, HIPAA-compliant hosting, and cybersecurity needs. A dynamic and highly elastic storage design enables scalability to meet ever-growing needs. Atlantic.Net also signs a Business Associate Agreement and handles all service management itself.

Dropbox Business

Dropbox Business can be configured to offer HIPAA-compliant cloud storage and offers its BAA to affected businesses. The service offers a variety of administrative controls, including user access reviews and user behavior reports. In addition, administrators can review and disconnect connected devices, and enable two-factor authentication for added security.

Conclusion

Organizations should be aware that neither the government nor the cloud service provider industry formally certifies HIPAA compliance; there is no official HITECH or HIPAA certification. Ensuring regulatory compliance is, therefore, the responsibility of the organizations and cloud service providers involved. A cloud service must be evaluated against the HIPAA regulations, and a provider may need to change its products, practices, and policies to help the affected organization achieve HIPAA compliance.

Cloud Computing: The Ever Expanding Buzzword

In the old days, say 2006, the term cloud computing referred to essentially one thing. To use the cloud, you accessed software over the Internet – “over the cloud.” The applications were always located in a remote location, sort of like Dick Cheney.

A couple years ago I interviewed Tim O’Brien, director of Microsoft’s Platform Strategy Group, about Redmond’s nascent cloud strategy. At the time, the cloud computing train was leaving the station and Microsoft knew it had to get on board. (Its recent Azure initiative being the most tangible result.) Amid the company’s fits and starts, O’Brien was clear in how he used the term: cloud computing meant accessing software outside the firewall.

But that straightforward definition has been lost to the sands of time, or at least the sandstorm of vendor excitement. As cloud computing has emerged as a red hot trend, tech vendors of every stripe have painted the term ‘cloud’ on their products, much like food brands all tout that they’re ‘low fat.’

Cloud variations keep expanding. Now we not only have Software as a Service (SaaS), but also Platform as a Service (PaaS), Hardware as a Service (HaaS) and Application as a Service (AaaS). (Actually, there is no AaaS, because even hype-crazed vendors know that it’s one acronym too far.)

Nick Carr, the IT guru and ardent cheerleader for the cloud, has even suggested the term Cloud as a Feature, or CaaF. A CaaF application combines elements that are installed on your hard drive with elements accessed over the Web. For instance, he posits that Google Earth is “kind of CaaFy.” If the term CaaF catches on, some day a poor tech blogger will write a post titled “Is your Software CaaFeinated?” That’s a day we must dread.

But of all the oddness in the gold rush of cloudspeak, the most disconcerting is how the term has lost its basic meaning as an external resource. Cloud computing can now be external or internal. That’s right, forward looking companies can now access the cloud without leaving home.

I recently spoke with Ed Walsh, the CEO of Virtual Iron, a scrappy but back-of-the-pack virtualization software firm. He used the phrase ‘build out a cloud’ to mean the same as ‘virtualizing your datacenter.’ Yet virtualization takes place inside the firewall. Virtualization software enables a server to handle multiple operating systems, and allows a roomful of servers to become a single pooled resource instead of discrete hunks of hardware. Plenty of companies are excited about virtualization – it’s a clear money saver – but are leery of cloud computing, with its hornet’s nest of security risks.
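The pooling idea Walsh alludes to can be sketched with a toy allocator: several discrete servers presented as one capacity figure, from which workloads draw CPU and memory. Server names, capacities, and the first-fit placement policy are all invented for illustration:

```python
# Toy sketch of the pooling idea behind "build out a cloud": discrete
# servers aggregated into one resource pool, with VMs placed first-fit.
# All capacities and VM sizes are invented.

class ResourcePool:
    """Aggregate server capacity and allocate it first-fit to workloads."""
    def __init__(self, servers):
        # servers: {name: (cpus, ram_gb)}
        self.free = dict(servers)

    def total_free(self):
        """Report remaining capacity as one pooled (cpus, ram_gb) figure."""
        cpus = sum(c for c, _ in self.free.values())
        ram = sum(r for _, r in self.free.values())
        return cpus, ram

    def allocate(self, cpus, ram_gb):
        """Place a VM on the first server with room; return its host or None."""
        for name, (free_c, free_r) in self.free.items():
            if free_c >= cpus and free_r >= ram_gb:
                self.free[name] = (free_c - cpus, free_r - ram_gb)
                return name
        return None

pool = ResourcePool({"rack1": (16, 64), "rack2": (8, 32)})
print(pool.total_free())        # (24, 96): one pooled capacity figure
print(pool.allocate(12, 48))    # 'rack1'
print(pool.allocate(8, 32))     # 'rack2' (rack1 no longer has room)
```

The point of the sketch is the user's view: applications ask the pool for resources, not a specific box, which is exactly why the same pooled abstraction gets described as an "internal cloud."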

So I had to double-check with Ed about his usage: You’re using virtualization and cloud to mean the exact same thing?

“Server virtualization is more of a base technology and depending on who you talk to, they mention it in different ways,” he told me. “People say, ‘Hey, I want to take a set of server resources, pool it together, and have it seamlessly be a resource pool that I put applications on.’ And that could be an internal cloud. Or it can be an external cloud.”

Hmmm… internal or external? “Cloud becomes this word they use,” he conceded.

I also recently spoke with Ed Sims, a VC and managing partner of Dawntreader Ventures, with $290 million under management. Given that he’s always looking for hot young companies to bankroll, he’s been eyeing some cloud start-ups. “I was talking to one company that allows you to run your own cloud, in your own datacenter, and make it look like it’s an instance of Amazon EC2 or Google AppEngine,” he told me. “It’s a very nascent, early play.”

That makes sense, yet again, his use of the term was shape-shifting the cloud concept. “It’s all within, or it can be without [the firewall],” Sims said, agreeing that ‘cloud’ is now used in myriad ways.

“Obviously it’s the buzzword du jour so you have to be careful about it,” he said.

But how can you be careful about a term that now refers to something that takes place internally, or externally, or – if you accept Nick Carr’s term CaaF – a combination of the two? At some point the term gets so broad that we need to stop calling it ‘cloud computing’ and simply call it ‘computing’ – because every form of computing is an instance of cloud computing. The phrase is beginning to collapse under the weight of the multitudinous things it refers to.

David Smith, an analyst with Gartner who has written extensively about cloud computing, says the term has indeed gotten stretched.

Best Linux Distros For The Enterprise

In this article, I’ll share the top Linux distros for enterprise environments. Some of these distros are used in server and cloud environments along with desktop duties. The one constant that all of these Linux options have is that they are enterprise grade Linux distributions – so you can expect a greater degree of functionality and, of course, support.

An enterprise grade Linux distribution comes down to the following – stability and support. Both of these components must be met to take any Linux distribution seriously in an enterprise environment. Stability means that the packages provided are stable to use while still maintaining an expected level of security.

The support element of an enterprise grade distribution means that there is a reliable support mechanism in place. Sometimes this is a single (official) source, such as a company. In other instances, it might be a governing not-for-profit that provides reliable recommendations for good third-party support vendors. Obviously, the former option is the best one; however, both are acceptable.

Red Hat has a number of great offerings, all with enterprise grade support made available. Their core focuses are as follows:

– Red Hat Enterprise Linux Server: This is a group of server offerings that includes everything from container hosting down to SAP server, among other server variants.

– Red Hat Enterprise Linux Desktop: These are tightly controlled user environments running Red Hat Linux that provide basic desktop functionality. This functionality includes access to the latest applications such as a web browser, email, LibreOffice and more.

– Red Hat Enterprise Linux Workstation: This is basically Red Hat Enterprise Linux Desktop, but optimized for high-performance tasks. It’s also best suited for larger deployments and ongoing administration.

Red Hat is a large, highly successful company that sells services around Linux. Basically Red Hat makes their money from companies that want to avoid vendor lock-in and other related headaches. These companies see the value in hiring open source software experts to manage their servers and other computing needs. A company need only buy a subscription and let Red Hat do the rest in terms of support.

SUSE is a fantastic company that provides enterprise users with solid Linux options. SUSE’s offerings are similar to Red Hat’s in that the company focuses on both the desktop and the server. Speaking from my own experiences with SUSE, I believe that YaST has proven to be a huge asset for non-Linux administrators looking to implement Linux boxes into their workplace. YaST provides a friendly GUI for tasks that would otherwise require some basic Linux command line knowledge.

SUSE’s core focuses are as follows:

– SUSE Linux Enterprise Server: This includes task-specific solutions ranging from cloud to SAP options, as well as mission-critical computing and software-based data storage.

– SUSE Linux Enterprise Desktop: For those companies looking to have a solid Linux workstation for their employees, SUSE Linux Enterprise Desktop is a great option. And like Red Hat, SUSE provides access to their support offerings via a subscription model. You can choose three different levels of support.

Why SUSE Linux Enterprise?

SUSE is a company that sells services around Linux, but they do so by focusing on keeping it simple. From their website down to the distribution of Linux offered by SUSE, the focus is ease of use without sacrificing security or reliability. While there is no question at least here in the States that Red Hat is the standard for servers, SUSE has done quite well for themselves both as a company and as contributing members of the open source community.

I’ll also go on record in suggesting that SUSE doesn’t take themselves too seriously, which is a great thing when you’re making connections in the world of IT. From their fun music videos about Linux down to the Gecko used in SUSE trade booths for fun photo opportunities, SUSE presents themselves as simple to understand and approachable.

Ubuntu Long Term Support (LTS) Linux is a simple to use enterprise grade Linux distribution. Ubuntu sees more frequent (and sometimes less stable) updates than the other distros mentioned above. Don’t misunderstand, Ubuntu LTS editions are considered to be quite stable. However, I think some experts may disagree if you were to suggest that they’re bulletproof.

Ubuntu’s core focuses are as follows:

– Ubuntu Server: This includes server, cloud and container offerings. Ubuntu also provides an interesting concept with their Juju cloud “app store” offering. Ubuntu Server makes a lot of sense for anyone who is familiar with Ubuntu or Debian. For these individuals, it fits like a glove and provides you with the command line tools you already know and love.

– Ubuntu IoT: Most recently, Ubuntu’s development team has taken aim at creating solutions for the “Internet of Things” (IoT). This includes digital signage, robotics and the IoT gateways themselves. My guess is that the bulk of the IoT growth we’ll see with Ubuntu will come from enterprise users and not so much from casual home users.

Why Ubuntu LTS?

Community is Ubuntu’s greatest strength, both among casual users and in its tremendous growth in the already crowded server market. The development and user communities using Ubuntu are rock solid. So while it may be considered more unstable than other enterprise distros, I’ve found that locking an Ubuntu LTS installation into a ‘security updates only’ mode provides a very stable experience.

First off, let’s address CentOS as an enterprise distribution. If you have your own in-house support team to maintain it, then a CentOS installation is a fantastic option. After all, it’s compatible with Red Hat Enterprise Linux and offers the same level of stability as Red Hat’s offering. Unfortunately, it’s not going to completely replace a Red Hat support subscription.

And Scientific Linux? What about that distribution? Well, like CentOS, it’s based on Red Hat Enterprise Linux. But unlike CentOS, there is no affiliation with Red Hat. Scientific Linux has had one mission from its inception – to provide a common Linux distribution for labs across the world. Today, Scientific Linux is basically Red Hat minus the trademarked material.

Neither of these distros are truly interchangeable with Red Hat as they lack the Red Hat support component.

Which of these is the top distro for enterprise? I think that depends on a number of factors that you’d need to figure out for yourself: subscription coverage, availability, cost, services and features offered. These are all considerations each company must determine for themselves. Speaking for myself personally, I think Red Hat wins on the server while SUSE easily wins on the desktop environment. But that’s just my opinion – do you disagree? Hit the Comments section below and let’s talk about it.

Snowflake And The Enterprise Data Platform

A new report entitled Data’s Evolution in the Cloud: The Lynchpin of Competitive Advantage explores executives’ attitude toward the essential – and challenging – process of data mining. Based on a survey conducted by The Economist and sponsored by Snowflake, the report details an industry in rapid flux, with big stakes and big challenges in current data analytics practice – focusing on the myriad innovations enabled by the cloud.

To provide insight into the intersection of data analytics and cloud computing, I spoke with Kent Graziano, Chief Technical Evangelist, Snowflake.


Edited highlights from the conversation: 

Snowflake is definitely getting a lot of buzz. Warren Buffett, of all people, bought into Snowflake pre-IPO.

Why is Snowflake so hot? It’s cloud native, but beyond that even.

The world’s changed so much with all of the data that’s out there, and companies need a way to innovate and be more agile. And what we’re seeing with our platform is that people are able to do that.

They can come in, they can start really, really small, and grow to massive size going into petabytes of data with no management overhead, really. It’s made it so much easier than when I started in the industry 30 years ago, where you had to pre-plan everything.

And you really had to know, where are we gonna go? What’s our three-year, five-year plan? How much data do we think we’re gonna have? How many users do we think we’re gonna have? We don’t have to do that anymore.

And that’s one of the things that I loved about Snowflake, because I came in, I was really a data architect, and a modeler and designer, and it’s like, “This is great, I can actually now work with the business, figure out what data do we really need, what kind of a model should it go into, and very quickly get that up and running without having to worry about, are we gonna have enough disk space?

Are we gonna have enough compute? How many users will we really have?” And I have to size for all that. I don’t have to do any of that with Snowflake, so that really allows me to accelerate the delivery of the value to the business.

You’re saying it’s in contrast to the old days where a large data mining or data analytics application would have been in-house, and that would have been far less scalable than Snowflake?

Yeah, yeah. The on-premises world by definition, you were constrained to a box. It’s a server. It’s got so many CPUs in it, and it’s got so much disk space when you initially buy it. And yes, you can plug… You could get to the point where we could plug in SANs and we could add more disk.

But you still had to plan for that, and then you had to go through a procurement process. I had times when I was building large data warehouses where we told the infrastructure team, “We’re gonna need 10 terabytes.” And they laughed at us and said, “No, you won’t.” And they got us two terabytes, and then three months later we were out of space. And then we had to wait six weeks to get more disk space.

And obviously that slowed down our ability to deliver to the business, because we just physically didn’t have the infrastructure. With Snowflake, you add data in and it elastically just grows. You don’t have to pre-allocate it; it’s just there on demand, and I don’t have to be a DBA or a system administrator to do anything. I just load the data in and it’s automagically there.
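The contrast being described can be sketched in a few lines of Python. This is a toy model with made-up class names, not Snowflake's actual implementation: a fixed-capacity store whose size was decided during procurement versus an elastic store that simply grows with each load.

```python
# Toy model (not Snowflake's implementation) contrasting pre-allocated
# on-premises storage with elastic, on-demand cloud storage.

class FixedStorage:
    """Capacity is chosen up front; loads fail once it is exhausted."""
    def __init__(self, capacity_tb):
        self.capacity_tb = capacity_tb
        self.used_tb = 0.0

    def load(self, size_tb):
        if self.used_tb + size_tb > self.capacity_tb:
            raise RuntimeError("out of space -- wait for procurement")
        self.used_tb += size_tb


class ElasticStorage:
    """No pre-allocation: capacity simply grows with the data loaded."""
    def __init__(self):
        self.used_tb = 0.0

    def load(self, size_tb):
        self.used_tb += size_tb  # grows on demand, no ceiling to plan for


on_prem = FixedStorage(capacity_tb=2)   # "they got us two terabytes"
try:
    for _ in range(5):
        on_prem.load(1)                 # fails once the 2 TB are used up
except RuntimeError as err:
    print(err)                          # out of space -- wait for procurement

cloud = ElasticStorage()
for _ in range(5):
    cloud.load(1)                       # just keeps growing
print(cloud.used_tb)                    # 5.0
```

The point of the sketch is the anecdote from the interview in miniature: the fixed box stalls at its procurement-time ceiling, while the elastic store never needed a capacity plan at all.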

What about the multi-cloud piece? Is it that it works with any of the clouds? And part two of that question is, the cloud providers themselves offer data applications, many of them. Why not just use the data application already offered by one of the hyperscalers?

To answer the first part of the question: it works on AWS, Azure and GCP. Snowflake is cloud-agnostic, so when you’re in Snowflake and you’re in the data cloud, you’re in the data cloud. It doesn’t matter what the underpinnings are, and that is giving people the ability to build a true network of data that is location-independent and cloud-independent.

Does that mean the data actually exists there [in various places], or does the data exist in other places and is being virtualized by Snowflake as a platform?

The data has a home in a particular physical location, and the Snowflake software is managing the… I don’t like the word replication, but replication, if you will, under the covers. So it’s not virtualization.

When we talk about virtualization software, we’re talking about, “Okay, the data is over here and we’re just looking over there.” We still have to pull it somewhere. But with Snowflake, our global data mesh allows that data to be replicated seamlessly to where it needs to be, where you want it to be, so it’s localized.

So you’re not in London querying data in Australia, though it looks like that’s what you’re doing. The data originated in Australia, but you don’t have to care now. This is the beauty of the cloud: you don’t have to care where the hardware is, where the data is. And when you throw Snowflake’s data cloud on top of it, you really don’t need to care, right? It’s handling all of that for you.
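The replication-based locality being described can be illustrated with a short sketch. This is a hypothetical model (the class and region names are made up, and it is not Snowflake's API): a dataset has a home region, copies are maintained elsewhere under the covers, and a read is served from the replica nearest the caller.

```python
# Minimal sketch (hypothetical, not Snowflake's API) of replication-based
# locality: data has a home region, replicas exist elsewhere, and a query
# is served from a local replica when one is available.

class ReplicatedDataset:
    def __init__(self, name, home_region):
        self.name = name
        self.home_region = home_region
        self.replicas = {home_region}  # the home copy counts as a replica

    def replicate_to(self, region):
        # Under the covers: copy the data so reads in `region` are local.
        self.replicas.add(region)

    def serve_from(self, caller_region):
        # Reads hit a local replica if one exists; otherwise the home copy.
        if caller_region in self.replicas:
            return caller_region
        return self.home_region


sales = ReplicatedDataset("sales", home_region="ap-southeast-2")  # Australia
sales.replicate_to("eu-west-2")                                   # London

print(sales.serve_from("eu-west-2"))   # local read: eu-west-2
print(sales.serve_from("us-east-1"))   # no replica -> home: ap-southeast-2
```

This matches the London/Australia example: once the data is replicated, the London query reads locally even though the data originated on the other side of the world.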

Fascinating. To wrap things up, I’d love to get your sense of the future and the future of the enterprise data platform. Maybe even more interesting, the future evolution of Snowflake. And as you answer, I’m gonna be listening to hear you say the words Artificial Intelligence.

Yeah, so I really see the future of data platforms is obviously the cloud, but it’s going way beyond what we traditionally thought of as just your basic analytics and dashboards. It is growing into that world of machine learning and artificial intelligence as the source for all that information. And one of the things we’ve learned about machine learning is the more data you have, the more accurate the results are going to be.

And now we have that ability to scale to multiple petabytes in the data cloud, so you have so much more data available to start feeding machine learning and AI types of applications. And through the sharing aspects – through the network of the data – you can take your third-party and partner data and incorporate it all into the data your organization creates itself, massage that, run your algorithms and projections off it, and perhaps produce a data product that others don’t have and share that right back. It becomes a virtuous cycle.

We’re really evolving into basically the world-wide web of data, where you’re gonna be able to find the data you need to do the job you need to do, make the predictions and forecasts, work with your customers, provide better customer service, and provide more value to your stakeholders.

And to me that goes way beyond. It’s probably the vision we had 20 or 30 years ago, but it took a lot of work to really make that happen, and only the largest organizations could ever afford to do it.

Now smaller organizations can do it because of the power we have with the data cloud in particular. And since we’re talking about the cloud, the sky is the limit, right?

And to make it a more performant experience as the volumes of data grow and grow. I wrote about this a couple of years ago: when you have all of this data available and you know how it’s being used, it’s just a matter of time before we can be even more predictive about what data you need and how you’re gonna use it.

Our search optimization feature that just came out is another really smart way of querying the data to get the performance you need – again, reducing that time to value even more.
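The general idea behind a search-optimization feature – speeding up highly selective lookups without scanning everything – can be shown with a toy inverted index. To be clear, this is an illustration of the principle only, not Snowflake's implementation: a full scan touches every row, while a lookup structure touches only the matching rows.

```python
# Toy illustration (not Snowflake's implementation) of why a search access
# path speeds up selective point lookups: compare rows touched by a full
# scan versus an inverted-index lookup for the same query.

rows = [{"id": i, "customer": f"cust_{i % 100}"} for i in range(10_000)]

def full_scan(rows, customer):
    """Examine every row; count how many rows the scan touches."""
    hits, touched = [], 0
    for row in rows:
        touched += 1
        if row["customer"] == customer:
            hits.append(row["id"])
    return hits, touched

# Build a simple inverted index mapping each value to its row ids.
index = {}
for row in rows:
    index.setdefault(row["customer"], []).append(row["id"])

def indexed_lookup(index, customer):
    """Jump straight to the matching row ids; only those rows are touched."""
    hits = index.get(customer, [])
    return hits, len(hits)

scan_hits, scan_touched = full_scan(rows, "cust_7")
idx_hits, idx_touched = indexed_lookup(index, "cust_7")
assert scan_hits == idx_hits            # same answer either way
print(scan_touched, idx_touched)        # 10000 100
```

Both paths return identical results; the index simply avoids touching the 9,900 rows that could never match, which is the "time to value" win being described.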

So in essence, the data becomes far faster and far more flexible to shape, imagine, and mold as an individual sees fit, and at the same time it is democratized so smaller players can get on board.

Exactly, that’s exactly right. Yes.
