Snowflake Computing raises $100 million to expand cloud data warehouse footprint

Snowflake Computing, a cloud data warehouse player led by former Microsoft exec Bob Muglia, raised money to expand its engineering team and European footprint. We talked shop with Muglia.

Snowflake Computing, a cloud data warehouse vendor, has closed a $100 million funding round as it aims to expand internationally.

The company, built on Amazon Web Services, was founded in 2012 and has raised $205 million so far. The funding round was led by ICONIQ Capital and Madrona Venture Group and included the company's existing investors.

Snowflake said it will build out its engineering team as it aims to take on more on-premises data warehouse workloads. CEO Bob Muglia, a former Microsoft exec, said the funding will help Snowflake scale. I caught up with Muglia to talk shop. Among the key themes:

Building on Amazon Web Services. Snowflake's cloud data warehouse service is built on AWS, which also competes via Redshift and other offerings. "AWS has treated us well. We compete with them obviously, but they are great partners," said Muglia. Translation: Netflix is built on AWS and competes with Amazon. Snowflake is a data warehouse spin on that theme.

Where are we in the cloud data warehouse adoption curve? Muglia said data warehouses in the cloud are still in the early adoption phase internationally, but have hit an inflection point in the US. Part of Snowflake's latest funding round will be devoted to growing in Europe, which has data sovereignty laws. "In Europe there is a lot of interest," said Muglia. When Snowflake first started, the company sold to early adopters in the media, ad tech and entertainment industries. "We were competing against Redshift primarily because it was the early leader," explained Muglia. "But the world has changed and mainstream companies are moving toward cloud adoption and they are looking for solutions that work with structured, unstructured, and transactional business data."

As a result, Snowflake is increasingly competing with Oracle, IBM's Netezza, and Teradata, said Muglia. Snowflake's architecture is built to manage traditional data warehouse workloads as well as unstructured data. "We built from the ground up for the cloud," said Muglia.

Typically, a customer with an existing data warehouse starts with Snowflake either by putting new workloads in the cloud or by eventually migrating outright.

Gartner places Snowflake in the Niche Players quadrant of its data management Magic Quadrant.

Hadoop. Snowflake is often landing customers that have become disillusioned with Hadoop. "Hadoop is an easy target because no one is happy and it's hard to make Hadoop work. It's basically a science project -- a multi-month, multi-year effort with custom SI (systems integrator) work," said Muglia. "If you're Netflix you can make Hadoop work, but there are maybe 10 to 20 companies that are proficient."

The Internet of Things. Muglia said IoT creates a lot of "semi-structured" data that is complicated to manage with a traditional structured approach. That situation is good for Snowflake. The next version of LTE will also create a surge in IoT data. Snowflake is engaging the public sector for smart city and other municipal deployments.

From:http://www.zdnet.com/article/snowflake-computing-raises-100-million-to-expand-cloud-data-warehouse-footprint/

Microsoft ‘Project Sopris’ takes aim at securing low-cost IoT devices

A new Microsoft Research team, Project Sopris, is looking to redesign microcontrollers in the name of making low-cost IoT devices more secure.

Microsoft researchers are working on a new project aimed at trying to secure low-cost Internet of Things (IoT) devices.

The Project Sopris team is "exploring the goal of securing the vast number of low-cost internet-connected devices coming online," says the research page for the Sopris project, which was officially established March 31, 2017.

"As part of this research work, we have tested different approaches to device security from silicon to software and hypothesize that optimal device security must be rooted in hardware but kept up-to-date through evolving software," explain the researchers.

Among those working on Sopris are partner research manager Galen Hunt; principal researcher Ed Nightingale; and senior hardware architect George Letey. As unearthed by "The Walking Cat" (@h0x0d on Twitter), senior director of silicon and system architecture Rob Shearer also seems to be on the team.

Hunt has been a key member of a number of previous significant Microsoft OS research projects, including Singularity, Drawbridge, and Menlo.

The Sopris team has published its first technical report, titled "The Seven Properties of Highly Secure Devices."

That paper notes that the Sopris researchers are paying special attention to the "tens of billions of devices powered by microcontrollers," as they are not prepared for the security challenges posed by internet connectivity.

The Sopris team is working with silicon partner MediaTek to revise one of its controllers -- the Wi-Fi-enabled MT7687 -- to create a prototype of a highly secure microcontroller.

Microsoft is looking to have security researchers test the Sopris security kit via the Project Sopris Challenge. The application period for the challenge closes April 14. Microsoft is offering bounties from $2,500 to $15,000 for submissions of eligible security vulnerabilities found in its early research prototype.

Early findings indicate that "even the most price-sensitive devices should be redesigned to achieve the high levels of device security critical to society's safety," the researchers say.

From:http://www.zdnet.com/article/microsoft-project-sopris-takes-aim-at-securing-low-cost-iot-devices/

Building my own Internet of Things ambient experience, one step at a time

How a lightbulb can become another way of getting the information you want, when you want it.

First, a confession. I always wanted one of those internet-connected rabbits.

You know the ones: they'd glow yellow if it was sunny, drop an ear if the NASDAQ had conniptions, and make burbling noises when a friend was trying to Skype you and you'd turned off all your speakers.

They were an early example of an ambient user interface, using the things around us to tell us about the things we can't see. A stock market movement waving a rabbit ear might seem odd at first, but it's a simple signal that encodes complex information. Catching that movement out of the corner of your eye gives you the opportunity to delve into richer tools and get at the information that triggered the ambient action.

Buying an off-the-shelf rabbit puts you at risk of losing the service, as all those devices eventually did. So is there a more open alternative, one that's less self-consciously cute, and one that's ready for the modern API-driven world?

As I've noted in other posts, I've been experimenting with retro-fitting smart devices into a Victorian London house. It's a challenge: the building isn't designed for technology beyond a gas lamp and a coal fire, so anything I fit has to be wirelessly connected to my office network.

A recent switch to a set of Netgear Orbi wireless mesh access points has made that a lot easier, with fewer, more powerful wireless routers looking like a single node to my devices. Orbi's smarter approach to wireless networking means it's also a better neighbour to the low-power Wi-Fi devices that make up my personal 'intranet of things'.

I've already written about the Arlo and Ring cameras over the front door, and they've recently been joined by a Ring Video Doorbell. Between them they've helped me monitor a package that had been left on the doorstep while I was on the other side of London, and provided entertaining wildlife footage of the local urban foxes' 2am antics.

They're only part of my home's sensor network: a Nest Protect smoke detector also provides CO monitoring, while a Netatmo home weather station collects data about temperature, rain, and wind from the roof of the house. It also provides tools for monitoring local pollutant levels, a useful feature for living in a city like London.

However, the thing about all these devices is that they're each very much their own thing, with their own apps on my phone and their own tabs in my browser. So how can I bring them together, and how can I see what they're telling me without having to look at my phone?

The answer was inspired by that wireless rabbit. What if I built some sort of ambient information device on my desk that could quickly show me what was going on in my world, giving me the facts I need when I need them?

Philips had recently sent me a set of their Hue bulbs -- colour-changing LED bulbs with a wireless hub that connected them to my home network. I put one in an old desk lamp, in the corner of my crowded desk, and started looking at how I could connect it to a range of different services.

While the Hue hub has a local set of RESTful APIs, it's been designed so only a limited number of trusted partners get access to Philips' cloud-hosted Hue service. One option, then, was to use a Raspberry Pi running Node-RED as a personal IoT hub, linking various cloud-hosted APIs and webhooks to a freely available module that could control my lights. It was certainly an attractive option, but was there a simpler way?
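
To make that concrete, here's a minimal sketch, in Python, of driving a bulb through the Hue bridge's local RESTful API. The bridge address, whitelisted username and light number are placeholders; the real values come from your own bridge.

    # Minimal sketch: set a Hue bulb's colour via the bridge's local REST API.
    import requests

    BRIDGE_IP = "192.168.1.10"        # hypothetical bridge address on the home network
    USERNAME = "my-whitelisted-user"  # hypothetical API username created via the bridge's link button

    def set_light(light_id, on=True, hue=46920, sat=254, bri=254):
        # PUT the new state to the bulb; hue runs 0-65535 around the colour wheel.
        url = "http://{}/api/{}/lights/{}/state".format(BRIDGE_IP, USERNAME, light_id)
        requests.put(url, json={"on": on, "hue": hue, "sat": sat, "bri": bri}, timeout=5).raise_for_status()

    set_light(1)  # light 1 turns blue -- the same effect the applets below will produce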

Luckily there was. One of Philips' trusted Hue partners is the personal no-code API connection service IFTTT. If This Then That uses simple API-driven triggers to connect one service to another, and you can build a personal library of applets that won't run until triggered.

With Ring and Netatmo as IFTTT inputs, and with Hue as an output, I could target that desk lamp, changing its colour in response to external events. Ring my doorbell, and the light goes blue. Is it raining? The light is green. That gives me a very simple set of tools for driving my lights from my various IoT devices.

IFTTT is useful, but it's limited. Its no-code approach limits you to known triggers and known outputs. You can't make decisions based on the content of input data, and you're limited to services where IFTTT has existing partnerships.

However, the answer to the problem is hidden away in the ever-growing list of services that IFTTT supports: its Maker Webhooks. Intended for users building their own IoT hardware, IFTTT's webhooks use a common web standard to give you a set of open triggers tied to your IFTTT account. All you need to do is create a webhook and copy its URL, which carries your personal authentication key. You'll notice that the URL has an {event} field: this is where you define the trigger your IFTTT applet is going to respond to. You'll also see there's the option of delivering a JSON payload from whatever triggers your webhook.

With support for multiple {event}s, you're able to build applets that connect to your Hue for a range of different outcomes, tying events to colours, to patterns, or simply to blinking the light. Things get interesting, though, when you take advantage of the payload, as now you can send a colour from another application to your Hue bulb, as in the sketch below.
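
Here's a minimal sketch of firing a Maker Webhook from Python. The event name and key are placeholders; both come from the Maker Webhooks page of your own IFTTT account.

    # Minimal sketch: trigger an IFTTT Maker Webhook, with an optional JSON payload.
    import requests

    IFTTT_KEY = "your-personal-key"  # hypothetical; shown on your Maker Webhooks settings page
    EVENT = "set_lamp_colour"        # hypothetical {event} name an applet is listening for

    def trigger(event, value1="", value2="", value3=""):
        # POST to the standard Maker Webhooks endpoint; value1-3 form the JSON payload.
        url = "https://maker.ifttt.com/trigger/{}/with/key/{}".format(event, IFTTT_KEY)
        requests.post(url, json={"value1": value1, "value2": value2, "value3": value3}, timeout=5).raise_for_status()

    trigger(EVENT, value1="blue")  # an applet can pass value1 through to the Hue bulb as its colour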

That made me wonder: could I connect my IFTTT applet to the output of a more information-oriented service like Microsoft's Flow?

The answer was, unsurprisingly, yes. Flow is able to send signals to a webhook, so all I needed to do was configure the appropriate URL for my IFTTT Maker Webhook and wire it up as a Flow output. Flow lets you build much more complex interactions than IFTTT, with tools for handling basic conditional workflows and stacking outputs, so one input can have multiple outcomes. It's also focused on working with enterprise data sources, plugging into Microsoft's and other cloud services.

Now, while I could have used a Flow to deliver a colour that encoded sales data from Dynamics, or my number of unread messages in Outlook from Office 365, I actually just used a Flow button to send a signal to IFTTT that would turn the light on.

It was simple, it was relatively uncomplicated, and, best of all, it worked.

I'd connected two different no-code services together, taking advantage of their strengths to do something that neither alone could do. I was also a long way down the road to building an ambient information system out of my desk lamp. All I need now is a list of my inputs and I'll be ready to go.

From:http://www.zdnet.com/article/building-my-own-internet-of-things-ambient-experience-one-step-at-a-time/

The dangers of the public internet

There are three types of attacks: ones that attack the confidentiality of data, ones that attack the integrity of data and ones that attack the availability of systems

The Internet is under attack.

It has been for many years, ever since “hacker” and “malware” first crept into our vocabulary. But, the internet has grown exponentially since those days. It was never meant to handle the level of data it traffics today, a level that exceeded 1 zettabyte last year.

The internet was originally built just to share files between users. The fact that it has grown into the massive web of data and endpoints we enjoy now -- one where smartphones, tablets and smart TVs will account for nearly 70 percent of Internet traffic by 2019 -- is an enormous convenience to how we work, communicate and live.

But that evolution has been enormously convenient for cyberattackers as well, whose methods for breaching systems, infecting them and stealing data, and bringing down networks have turned the internet and its billions of users into a digital dartboard, constantly under assault.

The CIA of data

Data breaches, DDoS (distributed denial of service) attacks, brute force decryption attacks -- these are just some of the more prominent examples of cyber and network security attacks we’ve seen grow in scale and frequency over the years. These cyberattacks come in three distinct types: ones that attack the confidentiality of data, ones that attack the integrity of data and ones that attack the availability of systems -- CIA, for short.

Target, Sony, OPM, the IRS and Snapchat are just a handful of recent victims of C and I attacks that have made some of the biggest headlines, striking at the confidentiality or integrity of millions of personal records. But A attacks -- like a DDoS attack that takes aim at service uptime -- are perhaps even bigger threats, bringing networks and cloud platforms offline for extended periods of time, disrupting business continuity, and even holding network availability and data for ransom, with an average price tag of $620,000 for enterprises. Even just the threat of a DDoS attack can wreak havoc.

You don't even need to be an intended target of these attackers to feel their impact. When a DDoS attack travels down the same line your traffic uses, it disrupts or shuts down your service all the same: your traffic is moving along the same path as the target's, and becomes collateral damage.

Internet apathy

The common thread through these CIA attacks is the public internet. As long as enterprises and end users are using the public internet for transmitting and storing data, they will always be putting themselves at risk.

Enterprises, SaaS providers and cloud services don’t have to operate over the public internet, but they choose to because it has always made sense economically. The alternative has always been costlier and more complex.

But why, then, do they still accept all of those risks, which grow more serious every year? Because the public internet is “good enough.” It works most of the time already, and when you've been used to it for 25 years, why rock the boat?

Enterprises and SaaS providers only start to reevaluate their reliance on the public internet after they have a bad experience -- a network breach, a disrupted conference call, something that negatively impacts a business-critical service. By then, it's too late, assuming they even treat it as a wake-up call at all.

What’s especially troubling is that these same enterprises are now relying on the cloud for their mission-critical apps. That means moving some of their most critical processes to a place that can only be accessed through a pipe that is notoriously unreliable and unsafe.

That’s just bad business sense. But, because these problems only occur intermittently, they’re not thought of as serious concerns ahead of time. More than that, many enterprises simply don’t have the network savvy to properly diagnose why they were attacked in the first place, and how the public internet itself is ultimately culpable.

Looking outside the box

The public internet is the root problem, and any solution that ignores that will only be attacking the symptoms, not the cause. That's why enterprises and their partners need to look outside of the box -- in this case, outside the internet itself -- to escape the dangers of the public internet, which will only grow as time goes on.

Solutions that provide private, reliable connections outside of the internet, such as interconnections and VPNs, give organizations a new way to network without having to worry about their data being suddenly impeded, stolen or shut down by an attacker. So long as enterprises aren't thinking about how to move their operations and data traffic off the public pathways of the internet, they will find themselves under constant threat of the next CIA attack, always just around the corner.

From:http://www.infoworld.com/article/3172730/security/the-dangers-of-the-public-internet.html

For internet privacy, a VPN won’t save you

In theory, getting a VPN is good advice. But the technology hasn't caught up to modern standards yet, and some "safe" services could put you at greater risk.

Last week, Congress voted to gut proposed internet privacy rules set out by the outgoing Obama administration that would have prevented your internet provider from selling your browser history to advertisers. President Donald Trump signed the bill a day later, making it law.

Many turned to what appeared to be an obvious solution: A virtual private network (VPN).

The idea of using a VPN is simple enough. The good ones are designed to push your internet traffic through a protected and secured tunnel, which shields your browsing records -- such as the websites you view -- from your internet provider. (As a result, some VPNs push your internet traffic through servers in other countries to trick content providers, like Netflix, into thinking you're in a different place -- usually in order to gain access to content in other geographies.)

But VPNs, for the most part, are lousy, often over capacity, and almost always significantly reduce your internet speeds. And, sometimes services simply don't work or load because they can detect you're using a VPN, forcing you to jump off the VPN -- effectively defeating the point of using the service on a long-term basis.

And, a lot of the time, the bad ones won't protect your privacy as they promise.

Some services are better than others. We're not here to tell you the best ones or pick sides, but there are some pointers to note from our sister site CNET and here on ZDNet. For example, paid services are usually better at hiding your traffic than free services, where the customer is usually the product.

But what compounds the problem is that some phony VPN services promise to protect your privacy when they don't, and are simply cashing in on the news, as Motherboard reported.

The big question to ask yourself is: Should I trust this VPN provider? More often than not, you can't and shouldn't.

Why? Not least because VPN providers don't always encrypt your web traffic, or don't use their own domain name servers (which means your internet provider can still see the websites you're accessing), and some run their servers in other countries, which means you're beholden to those countries' laws. As security researcher Troy Hunt said in a recent blog post, because VPN providers control your traffic, "they can inspect it, modify it, log it, and have a very good idea of what it is you're up to."

As security reporter Brian Krebs notes, many VPN providers "claim they keep zero records of customer activity," but "this is almost always untrue if you take the time to read the fine print."

Often, the reality is that you're paying for a VPN service that you have to trust more than your internet provider not to collect, monitor, or sell your data.

That's not to say there aren't good VPN providers out there, but you have to weigh up the reasons why you want to protect your browsing history and other data.

When push comes to shove, there are better ways to protect your browsing data than using a VPN.

And while nothing is perfect and the web will never be completely secure (nothing ever is), you're better off taking advantage of plugins like HTTPS Everywhere, which switches you to the secure version of a page wherever one is available.

Amazon, Google, Facebook, Twitter -- and yes, even Pornhub and YouPorn, your favorite online adult destinations, all offer HTTPS by default, which masks the page and its content (albeit not the domain) from internet-browsing snoopers.

And when all else fails, your internet provider isn't going to be able to monitor your activity on the Tor anonymity network any time soon.

And there's almost never going to be widespread adoption of VPN services among average internet users, nor should there be: many will inherently choose convenience, ease, and faster speeds over security and slowdowns, defeating the point altogether.

Given that this entire saga started with Congress voting to scrap internet privacy rules, the question to ask isn't "how do I protect my internet history," it's "how do we get out of this mess?"

From:http://www.zdnet.com/article/for-internet-privacy-a-vpn-will-not-save-you/

How to use a VPN to protect your internet privacy

A virtual private network can go a long way to make sure that neither your ISP, nor anyone else, can snoop on what you do on the internet.

Worried about your ISP? About someone on your coffee shop's Wi-Fi? Or about Joe A. Hacker bugging your internet connection? A virtual private network (VPN) can help protect your privacy.

A VPN uses encryption technologies, such as IP security (IPSec), Layer 2 Tunneling Protocol (L2TP)/IPSec, and Secure Sockets Layer (SSL) and Transport Layer Security (TLS), to create a virtual encrypted "tunnel" between your device and a VPN server. While your traffic is inside this tunnel, no one in between can see where you're going or what you're doing.

Besides protecting your privacy, VPN services are also commonly used for BitTorrent and other Peer-to-Peer (P2P) traffic since many ISPs frown on file-sharing. People also use VPNs to watch streaming video services, such as Netflix and Hulu, in areas where they aren't legally available. In recent years, the streaming services have taken steps to prevent VPNs from carrying their traffic.

While you can set up your own VPN server -- say, OpenVPN on Ubuntu -- that's too much work for most people, as the sketch below suggests. Your employer may offer VPN services for remote users; if so, ask if you can use it from home.
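
For a sense of what that work involves, here's a minimal sketch of an OpenVPN server configuration. It's illustrative only: the certificate and key files still have to be generated (typically with easy-rsa), the address pool is an arbitrary private range, and every client needs configuring too.

    # server.conf -- minimal OpenVPN sketch, not a hardened production config
    port 1194
    proto udp
    dev tun
    # Certificates and keys generated beforehand with easy-rsa (not shown):
    ca ca.crt
    cert server.crt
    key server.key
    dh dh.pem
    # Hand out client addresses from a private pool:
    server 10.8.0.0 255.255.255.0
    # Route all client traffic, including DNS, through the tunnel:
    push "redirect-gateway def1 bypass-dhcp"
    push "dhcp-option DNS 1.1.1.1"
    keepalive 10 120
    cipher AES-256-GCM
    persist-key
    persist-tun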

VPN services

For most people the answer is to use a VPN service. These companies enable you to create a VPN between your gadgets and their internet connection. Once your connection is on the other side of their VPN server, your traffic emerges without signs of who you are or where you're connecting from.

There's one fundamental concern with VPN services: Can you trust them not to track you? Some VPNs keep their own records of where you go on the net. If privacy is a real concern for you, check your VPN's terms and policies to see if they keep logs of your online activities. If they do, look for another VPN.

Some VPNs are far shadier than just logging your visits. In 2015, the free VPN service Hola was found to be selling its users' bandwidth to its Luminati service's paying customers.

You should also know before subscribing to a VPN service that you can be almost certain your internet speed will decline. That's because you're going to be sharing the VPN's broadband connection with other users. As always, an internet connection is only as fast as its slowest link.

The great majority of VPN services require you to install an application on your device. Many, but not all of them, also support Android and iOS, so you can secure your mobile traffic.

Most VPN services charge for their services. After all, a VPN provider must, at a bare minimum, pay for its own network equipment and broadband. Nonetheless, there are some decent free VPN services.

Some, such as Spotflux and Hotspot Shield, do this by placing ads in your stream. Others, including Steganos Online Shield and TunnelBear, will give you a free tunnel for a limited amount of traffic: both offer 500MB of bandwidth per month. Of the free services, I prefer Spotflux, but if you have minimal bandwidth needs, TunnelBear is also worthwhile.

As for the paid services, what you want is one with lots of bandwidth and multiple sites. Before subscribing to any of these services, try them out first. Many of them offer free trials, and it's worth taking them up on this. VPN performance varies wildly -- not just from company to company but from place to place. If you live near a VPN endpoint that's constantly overloaded you won't be happy, even if your brother across the country is getting great performance with the same VPN service.

That said, I've been using VPNs for over a decade and I've used many of them. The ones that have worked best for me are Banana VPN, NordVPN, Private Internet Access VPN, StrongVPN, and ZenMate.

Their prices vary. Generally speaking, the longer term you sign up for, such as a year paid in advance, the cheaper the subscription fee. This typically drops the price below $10 a month. But, as I mentioned, try the service first before getting locked into a long-term contract.

Installing a VPN tends to be mindlessly simple. The Opera web browser even comes with a built-in VPN these days.

Beyond VPNs

There are also other services that look like VPNs, one of which is the free web proxy. A web proxy is a server that acts as a middleman between you and a website. It sounds good, but about 75 percent of all free web proxies have recently been shown to be untrustworthy. If you're already using one and want to know if it's OK, you can test it with ProxyCheck.
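
Mechanically, a web proxy is just an extra hop for your requests, which is exactly why trust matters: the operator sees everything you send through it. Here's a minimal Python sketch of routing traffic through one; the proxy address is a placeholder.

    # Minimal sketch: route an HTTP request through a web proxy.
    import requests

    proxies = {
        "http": "http://203.0.113.7:8080",   # hypothetical proxy endpoint
        "https": "http://203.0.113.7:8080",  # the site sees the proxy's IP, not yours
    }
    resp = requests.get("https://example.com", proxies=proxies, timeout=10)
    print(resp.status_code)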

Another popular privacy solution is Tor. This is a software and network pairing that hides your identity by moving your traffic across different Tor servers and encrypting it along the way. However, there's reason to believe that Tor isn't as secure as its reputation suggests. The Justice Department recently dropped a case because it didn't want to reveal how it had cracked Tor.

Your ISP isn't going to be cracking your Tor connection. For most people the real problem with Tor is that its connections tend to be very slow.

Do you really need to worry about any of this? I think you do. Sure, the major ISPs claim they're not going to spy on you, but I don't believe them. Even before the government decided to let the ISPs sell your browsing history, the big ISPs have had a track record of playing fast and loose with your privacy.

For most people, the best internet privacy solution is a good, fast VPN provider you can trust. Give the ones I suggested a try. I'm sure you'll find one you like.

From:http://www.zdnet.com/article/how-to-use-a-vpn-to-protect-your-internet-privacy/

IBM aims to commercialize quantum computing, launches API, SDK and sees Q systems in next few years

IBM put some more meat on its roadmap and plans to commercialize quantum computing for enterprises. For now, developers will get an API and a software development kit to play with qubits.

IBM is launching an application programming interface (API) and software development kit (SDK) for public access to quantum computing via IBM Cloud. The company also outlined plans to build commercial quantum computing systems in the next few years.

The move is the latest in IBM's effort to commercialize quantum computing, which applies quantum theory, using subatomic particles to store and process information. Quantum computers are expected to be much faster than today's systems at certain problems, capable of tasks conventional machines can't practically perform.

Last year, IBM launched the Quantum Experience, which enabled developers to run algorithms and experiments on the company's quantum processor and work with individual quantum bits, or qubits.

With the API, IBM Cloud will give developers a portal and access to a 5-qubit quantum computer. IBM is hoping developers will use the tool to build interfaces and experiment.
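
For a flavour of what programming at the qubit level looks like, here's a minimal Bell-state experiment in Python using Qiskit, the open-source SDK that grew out of this IBM effort. The package layout reflects today's Qiskit releases rather than the 2017 API, and the local simulator stands in for the cloud-hosted 5-qubit machine.

    # Minimal sketch: entangle two qubits and measure them (a Bell state).
    from qiskit import QuantumCircuit, transpile
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)                     # put qubit 0 into superposition
    qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])  # collapse both qubits into classical bits

    sim = AerSimulator()
    result = sim.run(transpile(qc, sim), shots=1024).result()
    print(result.get_counts())  # roughly half '00' and half '11' -- never '01' or '10'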

IBM also updated its quantum simulator, which can now model circuits of up to 20 qubits. Big Blue sees quantum computing impacting everything from drug and materials discovery to logistics to financial services. There will also be implications for machine learning, artificial intelligence, and cloud security.

According to the company, a full SDK on the IBM Quantum Experience will be available in the first half of 2017.

To commercialize quantum computing, IBM said it will build IBM Q systems with about 50 qubits "in the next few years." These systems would give IBM an asset it could use to engage industry and develop applications. The most likely model for consumption, however, is quantum computing delivered via the cloud.

To date, 40,000 users have run more than 275,000 experiments on IBM's Quantum Experience.

From:http://www.zdnet.com/article/ibm-aims-to-commercialize-quantum-computing-launches-api-sdk-and-sees-q-systems-in-next-few-years/

IBM and Red Hat aim to boost hybrid cloud computing, OpenStack usage

Big Blue touts hybrid cloud deals with Red Hat and Veritas.

IBM and Red Hat are working together to encourage the use of OpenStack and make it easier for companies to shift their Linux workloads into private clouds.

IBM said that Red Hat OpenStack Platform and Red Hat Ceph Storage on IBM Private Cloud will be generally available at the end of March, ahead of which IBM has become a Red Hat Certified Cloud and Service Provider. Big Blue said this would help "enterprises benefit from the OpenStack Platform's speed and economics".

Also as part of the agreement, Red Hat Cloud Access will become available for IBM Cloud by the end of the second quarter, allowing Red Hat customers to move unused Red Hat Enterprise Linux subscriptions from their data centers into IBM Cloud data centers worldwide.

Red Hat Cloud Access allows Linux customers to retain services and support while moving workloads into the cloud.

"Our collaboration with IBM is aimed at helping enterprise customers more quickly and easily embrace hybrid cloud," said Radhesh Balakrishnan, general manager of OpenStack at Red Hat. "Now, customers who don't have in-house expertise to manage an OpenStack infrastructure can more confidently consume Red Hat OpenStack Platform and Red Hat Ceph Storage on IBM Private Cloud."

IBM and Red Hat said they will provide the hybrid cloud infrastructure to help customers more efficiently run cloud applications using OpenStack APIs. Customers will be able to provision cloud infrastructure faster and, using Red Hat Cloud Access, migrate existing workloads and Red Hat subscriptions to IBM Cloud, or use the software and infrastructure on a pay-as-you-go basis.

IBM and Red Hat said they will jointly sell new offerings for private cloud deployments, including workload migrations, disaster recovery, capacity expansion and data center consolidation.

IBM has also signed a hybrid cloud deal with Veritas to help enterprises working with increasing data volumes better manage, optimize and protect data across hybrid cloud environments.

Veritas has certified the IBM Cloud Object Storage family of software and cloud services for use with Veritas NetBackup 8.0, making it easier for customers to migrate data from on-premises systems to the cloud for greater storage capabilities.

In turn, IBM has certified NetBackup 8.0 to run on the IBM Cloud to offer clients additional data protection for cloud-based workloads. NetBackup 8.0 is due to be available in the second quarter and will be available for order from the IBM Bluemix Catalog of services.

From:http://www.zdnet.com/article/ibm-and-red-hat-aim-to-boost-hybrid-cloud-computing-openstack-usage/

Cloud computing is the new normal: Is it time to use it for everything?

The big question is not whether you should use cloud computing, but what happens if you use it for everything

On-demand IT has reached a tipping point and organisations of all sizes and sectors are using cloud computing services to run and develop their businesses.

But where does the cloud go next and what are some of the interesting use cases that will help take cloud to the next level?

Four business and tech leaders discuss what the cloud now means for their businesses.

1. Overcoming legacy concerns to leave the internal data centre

Okta CIO Mark Settle runs his organisation, an identity management specialist, using about 140 cloud-based applications. "I have no data centre to worry about," he says. "It makes the budgeting cycle so much easier. You basically look at your list of SaaS subscription fees and project what the future costs will be like. It can be done in as little as 90 minutes."

Settle recognises that this shift from capital to operational expenditure will have a fundamental impact on the role of the IT leader. "It's the future and it's also a very different approach from the one I've taken in any of my previous businesses," he says, looking back on a career that has included seven CIO positions.

Settle believes the cloud is now a business-as-normal activity. "Almost all executives are looking to go cloud first now -- there's very few people writing software and buying new servers to run those applications in a data centre," he says. However, he also appreciates that key challenges remain, particularly regarding legacy applications.

"On the infrastructure side, the cloud has moved from something used for testing and development to a platform for production services. People are becoming increasingly confident moving systems to the cloud and getting their stuff out of the internal data centre," says Settle.

"However, it's also a fallacy to think that large, global enterprises are going to completely abandon their data centres. I think there'll always be legacy applications that need to be maintained in-house, be that for cost reasons or a desire not to disrupt how systems work currently. The hope has to be that cutting-edge work around containerisation will help some of the doubters to deal with their legacy concerns."

2. Using on-demand IT for almost everything

The future of the cloud, says CIO consultant Andrew Abboud, is very closely related to preconceived notions of on-demand IT. "Let's get this straight," he says. "Executives around the business don't talk about the cloud -- they're not interested in the technology per se, they just want to solve the business challenges they face."

CIOs must help ensure the hype surrounding the IT industry does not get in the way. "As technologists, we get hung up on buzzwords when we should be focused on the opportunities," says Abboud. "Every organisation is different and every business must understand how the cloud will deliver benefits."

Once the CIO has helped the rest of the business to establish the context of implementation, the key debate is simply how far an organisation can push its use of on-demand IT. "We're seeing a move towards online services across business and, in the future, the key question concerns saturation -- in most cases, why wouldn't you use the cloud for everything?"

Abboud recognises concerns persist, such as around information security and the porting of legacy applications. But he is hopeful such challenges can be overcome effectively. "If you accept the logic that an external cloud provider is going to be more secure than an internal data centre, then you should really push as much of your business to the cloud as possible," says Abboud.

"Legacy applications can be a problem, especially in the finance sector. But every industry must bite the bullet and transform eventually. CIOs need to appreciate that the Cobol specialists will die out -- you have to deal with change now."

3. Boosting real-time marketing and sales communications

Experienced CMO Sarah Speake says cloud computing is a welcome addition to the marketeer's digital kit bag. Analysts spend a great deal of time investigating the role of CIOs and CMOs in an age of decentralised purchasing. Speake says the next frontier for the cloud involves CMOs helping their departments to make the most of on-demand capability.

"We should take collective responsibility for up-skilling our teams sufficiently to navigate cloud-based apps and tools appropriately to drive speed, efficiency and transparency," she says. Speake, who is an experienced CMO who has held senior marketing positions at ITV and Google, says cloud-based systems can help marketers move away from dangerous assumptions.

"For all too long, we shared key documents, like Excel spreadsheets, internally via email, never knowing whether the one we were inputting into or scrutinising to drive in-depth customer segmentation was the most current or not," she says. "Accuracy was an unknown quantity, so our responsibilities in driving additional leads and revenues to the bottom line were often hard to prove."

Speake says cloud computing can provide further boosts for marketeers, such as through real-time access to customer relationship management data or via integrated marketing automation and communication. Once again, she says CMOs -- rather than CIOs -- can help people across the business to make the most of cloud-based services.

"In part, our role is to help our sales friends continuously assess customer prioritisation, depending on short- and long-term revenue potential," she says. "Equally, we will need to revise our own marketing communications to ensure we're driving leads or maintaining existing customers depending on our organisational business model."

4. Enabling education and development from any location

Matt Britland, director of ICT at Lady Eleanor Holles School, says his school uses Google Apps for Education and Microsoft Office 365. Both implementations are managed internally by the school. He says sensitive data relating to the school is not stored in the cloud. However, the technology is already playing a key role in education -- and that part will only grow in coming years.

"The cloud allows our students to work from any location as long as they have an internet-connected device," says Britland. "The cloud has to be part of the future of education because it enables learning to happen everywhere."

He is currently running a cloud-based initiative that allows pupils to work in teams and collaborate on the same project. The school's use of cloud-based productivity apps also extends to staff. Britland says the right preparations are crucial.

"It can be a challenge explaining the software and its benefits to people who are new to the cloud," he says. "You have to put the right training in place and I've been running training courses. Most education professionals, however, are keen to learn and explore new opportunities."

From:http://www.zdnet.com/article/cloud-computing-is-the-new-normal-is-it-time-to-use-it-for-everything/

Internet of Things security: What happens when every device is smart and you don’t even know it?

When IoT devices are everywhere, the security headaches just get worse.

Billions more everyday items are set to be connected to the internet in the next few years, especially as chips get cheaper and cheaper to produce -- and crucially, small enough to fit into even the smallest product.

Potentially, any standard household item could become connected to the internet, even if there's no reason for the manufacturers to do so.

Eventually the processors needed to power an IoT device will become effectively free, making it possible to turn anything into an internet-enabled device.

"The price of turning a dumb device into a smart device will be 10 cents," says Mikko Hyppönen, chief research officer at F-Secure.

However, it's unlikely that the consumer will be the one who gains the biggest benefit from every device in their home collecting data; it's those who build the devices who will reap the greatest rewards -- alongside government surveillance services.

"It's going to be so cheap that vendors will put the chip in any device, even if the benefits are only very small. But those benefits won't be benefits to you, the consumer, they'll be benefits for the manufacturers because they want to collect analytics," says Hyppönen, speaking at Cloud Expo Europe.

For example, a kitchen appliance manufacturer might collect data and use it for everything from seeing how often the product breaks to working out where customers live and altering its advertising accordingly in an effort to boost sales -- and the user might not even know this is happening, because devices with their own 5G connection wouldn't need access to a home Wi-Fi network.

"The IoT devices of the future won't go online to benefit you -- you won't even know that it's an IoT device," says Hyppönen.

"And you won't be able to avoid this, you won't be able to buy devices which aren't IoT devices, you won't be able to restrict access to the internet because they won't be going online through your Wi-Fi. We can't avoid it, it's going to happen."

Indeed, it's already started, with devices you wouldn't expect to need an internet connection -- including children's toys -- being discovered to have gaping cybersecurity vulnerabilities.

These scenarios, says Darren Thomson, CTO and vice president of technology services at Symantec, are occurring because those in the technology industry are thinking about whether they could connect things to the internet, but aren't thinking about whether they should.

"Could I attach my dog to the internet? Could I automate the process of ordering a taxi on my mobile phone? We're obsessed with 'could we' problems. That's how we live our lives and careers: we invent things and we solve problems. We're good at 'could we'," he said, also speaking at Cloud Expo Europe.

No matter the reason why things are being connected to the internet, Thomson agrees with Hyppönen about what the end goal is: data collection.

"The connectivity of those devices is impressive and important. But what's more important is how that's coming to bear across various markets. Every single sector on the planet is in a race to digitise, to connect things. And very importantly, to collect data from those things," he says.

However, various incidents have demonstrated how the Internet of Things is ripe with security vulnerabilities as vendors put profit and speed to market before anything else, with cybersecurity very low down the list of priorities.

Retrofitting updates via patches might work for a PC, a laptop or even a smartphone, but there are huge swathes of devices -- and even whole internet-connected industrial or urban facilities -- for which being shut down in order to install an update is impossible.

"The security industry to date is predicated on the benefit of the retrofit. IT has designed insecure systems then we've secured them. That's kind of OK in a world where a device can have some downtime," says Thomson.

"But a car, a building, a city, a pipeline, a nuclear power facility can't tolerate downtime. So if we don't build security and privacy in to our designs from the very first whiteboard, we're going to leave ourselves with a problem."

Not only that, but as IoT devices become more and more common, people will start to ignore them.

"The reality of the human mind is as we embed things, we tend to forget about them, we get complacent about them. Many of you are probably wearing a smart device on your wrist to monitor your behaviour and exercise routines. But no doubt two weeks after you started wearing it, you forgot it was there," he says.

"The danger from a psychological perspective is that people forget about that technology and forget about the risks associated with it and our own personal mitigation of that risk."

Even now, consumers are too blasé about connected devices, keen to jump on the latest technological trends while failing to realise the associated security risks. Even when they do, they remain unclear on how to secure their IoT devices -- that is, if there's an option to secure them in the first place.

"Nobody reads the manual, especially to page 85 where it says how to change the default credentials, or page 90 where it says how to set up user accounts and restrict access to the admin interface, or page 100 where it says how to segment your network," says Hyppönen.

He likens it to the "exact same problem we had in the 80s" when people wouldn't even bother to set a time on their video recorder as it involved picking up the manual, so it'd end up always flashing 12:00.

It's therefore important for the Internet of Things' cybersecurity loopholes to be closed sooner rather than later, to avoid nightmare scenarios in which hackers exploit vulnerabilities to attack anything from pacemakers and other medical devices to connected cars and even entire industrial facilities.

But are IoT device manufacturers going to do this anytime soon? Probably not.

"The manufacturers of IoT devices are unlikely to fix this by themselves. They're unlikely to start investing more money in the security of their IoT devices, because money is the most important thing in home appliances," says Hyppönen.

"When you buy a washing machine, price is the most important selling point. Nobody's asking, 'does it have a firewall or intrusion prevention systems?' Cybersecurity isn't a selling point for a washing machine, so why would manufacturers invest money in it?" he adds.

It might eventually be regulation which has to fix this problem; as Hyppönen points out, device safety is already regulated. "When you buy a washing machine, it must not short circuit and catch fire, we regulate that. Maybe we should regulate security," he says.

From:http://www.zdnet.com/article/internet-of-things-security-what-happens-when-every-device-is-smart-and-you-dont-even-know-it/