Monthly archives for March, 2017

IBM aims to commercialize quantum computing, launches API, SDK and sees Q systems in next few years

IBM put some more meat on its roadmap and plans to commercialize quantum computing for enterprises. For now, developers will get APIs and a software development kit to play with qubits.

IBM is launching an application programming interface (API) and software development kit (SDK) for public access to quantum computing via IBM Cloud. The company also outlined plans to build commercial quantum computing systems in the next few years.

The move is the latest in IBM’s effort to commercialize quantum computing, which uses quantum-mechanical properties of subatomic particles to store and process information. Quantum computers are expected to be much faster than today’s systems for certain problems, and capable of tackling tasks that are impractical for classical machines.

Last year, IBM launched the Quantum Experience, which enabled developers to run algorithms and experiments on the company’s quantum processor and work with individual quantum bits, or qubits. Here’s a look at the Quantum Experience and formulating a task.

With the API, IBM Cloud will give developers a portal and access to a 5-qubit quantum computer. IBM is hoping developers will use the tool to build interfaces and experiment.

IBM also updated its quantum simulator, which can now model experiments with up to 20 qubits. Big Blue sees quantum computing impacting everything from drug and materials discovery to logistics to financial services. There will also be implications for machine learning, artificial intelligence and cloud security.
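At the smallest scale, the state-vector simulation such a tool performs can be sketched in a few lines of plain Python. This is an independent illustration of the underlying idea, not IBM’s simulator or its API:

```python
import math

# Minimal single-qubit state-vector simulation (an independent
# sketch of the technique, not IBM's actual simulator or API).

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-element state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

def probabilities(state):
    """Measurement probabilities for the |0> and |1> outcomes."""
    return [abs(amp) ** 2 for amp in state]

# Hadamard gate: puts a qubit into an equal superposition.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

qubit = [1.0, 0.0]           # start in |0>
qubit = apply_gate(H, qubit)
print(probabilities(qubit))  # roughly [0.5, 0.5]
```

A 5- or 20-qubit simulation works the same way, but the state vector grows to 2^n amplitudes, which is why classically simulating much beyond a few dozen qubits quickly becomes intractable.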

According to the company, a full SDK on the IBM Quantum Experience will be available in the first half of 2017.

To commercialize quantum computing, IBM said it will build IBM Q systems with about 50 qubits “in the next few years.” These systems would give IBM an asset it could use to engage industry and develop applications. However, the most likely model for consuming quantum computing is delivery via the cloud.

To date, 40,000 users have run more than 275,000 experiments on IBM’s Quantum Experience.


IBM and Red Hat aim to boost hybrid cloud computing, OpenStack usage

Big Blue touts hybrid cloud deals with Red Hat and Veritas.

IBM and Red Hat are working together to encourage the use of OpenStack and make it easier for companies to shift their Linux workloads into private clouds.

IBM said that Red Hat OpenStack Platform and Red Hat Ceph Storage on IBM Private Cloud will be generally available at the end of March, ahead of which IBM has become a Red Hat Certified Cloud and Service Provider. Big Blue said this would help “enterprises benefit from the OpenStack Platform’s speed and economics”.

Also as part of the agreement, Red Hat Cloud Access will become available for IBM Cloud by the end of the second quarter, allowing Red Hat customers to move unused Red Hat Enterprise Linux subscriptions from their data centers into IBM Cloud data centers worldwide.

Red Hat Cloud Access allows Linux customers to retain services and support while moving workloads into the cloud.

“Our collaboration with IBM is aimed at helping enterprise customers more quickly and easily embrace hybrid cloud,” said Radhesh Balakrishnan, general manager of OpenStack at Red Hat. “Now, customers who don’t have in-house expertise to manage an OpenStack infrastructure can more confidently consume Red Hat OpenStack Platform and Red Hat Ceph Storage on IBM Private Cloud.”

IBM and Red Hat said they will provide the hybrid cloud infrastructure to help customers more efficiently run cloud applications using OpenStack APIs. Customers will be able to provision cloud infrastructure faster and, using Red Hat Cloud Access, migrate existing workloads and Red Hat subscriptions to IBM Cloud, or use the software and infrastructure on a pay-as-you-go basis.

IBM and Red Hat said they will jointly sell new offerings for private cloud deployments, including workload migrations, disaster recovery, capacity expansion and data center consolidation.

IBM has also signed a hybrid cloud deal with Veritas to help enterprises working with increasing data volumes better manage, optimize and protect data across hybrid cloud environments.

Veritas has certified the IBM Cloud Object Storage family of software and cloud services for use with Veritas NetBackup 8.0, making it easier for customers to migrate data from on-premises systems to the cloud for greater storage capabilities.

In turn, IBM has certified NetBackup 8.0 to run on the IBM Cloud to offer clients additional data protection for cloud-based workloads. NetBackup 8.0 is due to be available in the second quarter and will be available for order from the IBM Bluemix Catalog of services.


Cloud computing is the new normal: Is it time to use it for everything?

The big question is not whether you should use cloud computing, but what happens if you use it for everything

On-demand IT has reached a tipping point and organisations of all sizes and sectors are using cloud computing services to run and develop their businesses.

But where does the cloud go next and what are some of the interesting use cases that will help take cloud to the next level?

Four business and tech leaders discuss what the cloud now means for their businesses.

1. Overcoming legacy concerns to leave the internal data centre

Okta CIO Mark Settle runs his organisation, an identity management specialist, using about 140 cloud-based applications. “I have no data centre to worry about,” he says. “It makes the budgeting cycle so much easier. You basically look at your list of SaaS subscription fees and project what the future costs will be like. It can be done in as little as 90 minutes.”
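Settle’s 90-minute budgeting exercise really is just arithmetic over a subscription list; here is a toy version, with invented vendor categories and fees:

```python
# Hypothetical numbers, illustrating Settle's point that a cloud-only
# budget reduces to projecting a list of SaaS subscription fees.
subscriptions = {          # annual fees in dollars (invented)
    "identity": 120_000,
    "crm": 250_000,
    "collaboration": 80_000,
}
growth = 1.10              # assume fees rise ~10% a year (assumption)

this_year = sum(subscriptions.values())
next_year = round(this_year * growth)
print(this_year, next_year)  # 450000 495000
```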

Settle recognises that this shift from capital to operational expenditure will have a fundamental impact on the role of the IT leader. “It’s the future and it’s also a very different approach from the one I’ve taken in any of my previous businesses,” he says, looking back on a career that has included seven CIO positions.

Settle believes the cloud is now a business-as-normal activity. “Almost all executives are looking to go cloud first now — there’s very few people writing software and buying new servers to run those applications in a data centre,” he says. However, he also appreciates that key challenges remain, particularly regarding legacy applications.

“On the infrastructure side, the cloud has moved from something used for testing and development to a platform for production services. People are becoming increasingly confident moving systems to the cloud and getting their stuff out of the internal data centre,” says Settle.

“However, it’s also a fallacy to think that large, global enterprises are going to completely abandon their data centres. I think there’ll always be legacy applications that need to be maintained in-house, be that for cost reasons or a desire not to disrupt how systems work currently. The hope has to be that cutting-edge work around containerisation will help some of the doubters to deal with their legacy concerns.”

2. Using on-demand IT for almost everything

The future of the cloud, says CIO consultant Andrew Abboud, is very closely related to preconceived notions of on-demand IT. “Let’s get this straight,” he says. “Executives around the business don’t talk about the cloud — they’re not interested in the technology per se, they just want to solve the business challenges they face.”

CIOs must help ensure the hype surrounding the IT industry does not get in the way. “As technologists, we get hung up on buzzwords when we should be focused on the opportunities,” says Abboud. “Every organisation is different and every business must understand how the cloud will deliver benefits.”

Once the CIO has helped the rest of the business to establish the context of implementation, the key debate is simply how far an organisation can push its use of on-demand IT. “We’re seeing a move towards online services across business and, in the future, the key question concerns saturation — in most cases, why wouldn’t you use the cloud for everything?”

Abboud recognises concerns persist, such as around information security and the porting of legacy applications. But he is hopeful such challenges can be overcome effectively. “If you accept the logic that an external cloud provider is going to be more secure than an internal data centre, then you should really push as much of your business to the cloud as possible,” says Abboud.

“Legacy applications can be a problem, especially in the finance sector. But every industry must bite the bullet and transform eventually. CIOs need to appreciate that the COBOL specialists will die out — you have to deal with change now.”

3. Boosting real-time marketing and sales communications

Experienced CMO Sarah Speake says cloud computing is a welcome addition to the marketeer’s digital kit bag. Analysts spend a great deal of time investigating the role of CIOs and CMOs in an age of decentralised purchasing. Speake says the next frontier for the cloud involves CMOs helping their departments to make the most of on-demand capability.

“We should take collective responsibility for up-skilling our teams sufficiently to navigate cloud-based apps and tools appropriately to drive speed, efficiency and transparency,” she says. Speake, who is an experienced CMO who has held senior marketing positions at ITV and Google, says cloud-based systems can help marketers move away from dangerous assumptions.

“For all too long, we shared key documents, like Excel spreadsheets, internally via email, never knowing whether the one we were inputting into or scrutinising to drive in-depth customer segmentation was the most current or not,” she says. “Accuracy was an unknown quantity, so our responsibilities in driving additional leads and revenues to the bottom line were often hard to prove.”

Speake says cloud computing can provide further boosts for marketeers, such as through real-time access to customer relationship management data or via integrated marketing automation and communication. Once again, she says CMOs — rather than CIOs — can help people across the business to make the most of cloud-based services.

“In part, our role is to help our sales friends continuously assess customer prioritisation, depending on short- and long-term revenue potential,” she says. “Equally, we will need to revise our own marketing communications to ensure we’re driving leads or maintaining existing customers depending on our organisational business model.”

4. Enabling education and development from any location

Matt Britland, director of ICT at Lady Eleanor Holles School, says his school uses Google Apps for Education and Microsoft Office 365. Both implementations are managed internally by the school, and sensitive data relating to the school is not stored in the cloud. However, the technology is already playing a key role in education — and that role will only grow in coming years.

“The cloud allows our students to work from any location as long as they have an internet-connected device,” says Britland. “The cloud has to be part of the future of education because it enables learning to happen everywhere.”

He is currently running a cloud-based project that allows pupils to work in teams and collaborate on the same project. The school’s use of cloud-based productivity apps is also extended to staff. Britland says the right preparations are crucial.

“It can be a challenge explaining the software and its benefits to people who are new to the cloud,” he says. “You have to put the right training in place and I’ve been running training courses. Most education professionals, however, are keen to learn and explore new opportunities.”


Internet of Things security: What happens when every device is smart and you don’t even know it?

When IoT devices are everywhere, the security headaches just get worse.

Billions more everyday items are set to be connected to the internet in the next few years, especially as chips get cheaper and cheaper to produce — and crucially, small enough to fit into even the smallest product.

Potentially, any standard household item could become connected to the internet, even if there’s no reason for the manufacturers to do so.

Eventually the processors needed to power an IoT device will become effectively free, making it possible to turn anything into an internet-enabled device.

“The price of turning a dumb device into a smart device will be 10 cents,” says Mikko Hyppönen, chief research officer at F-Secure.

However, it’s unlikely that the consumer will be the one who gains the biggest benefit from every device in their home collecting data; it’s those who build the devices who will reap the greatest rewards — alongside government surveillance services.

“It’s going to be so cheap that vendors will put the chip in any device, even if the benefits are only very small. But those benefits won’t be benefits to you, the consumer, they’ll be benefits for the manufacturers because they want to collect analytics,” says Hyppönen, speaking at Cloud Expo Europe.

For example, a kitchen appliance manufacturer might collect data and use it for everything from seeing how often the product breaks to working out where customers live and altering its advertising accordingly in an effort to boost sales. The user might not even know this is happening: if devices ship with their own 5G connections, they won’t need access to a home Wi-Fi network at all.

“The IoT devices of the future won’t go online to benefit you — you won’t even know that it’s an IoT device,” says Hyppönen.

“And you won’t be able to avoid this, you won’t be able to buy devices which aren’t IoT devices, you won’t be able to restrict access to the internet because they won’t be going online through your Wi-Fi. We can’t avoid it, it’s going to happen.”

Indeed, it’s already started, with devices you wouldn’t expect to need an internet connection — including children’s toys — being discovered to have gaping cybersecurity vulnerabilities.

These scenarios, says Darren Thomson, CTO & vice president of technology services at Symantec, are occurring because those in the technology industry are thinking about whether they could connect things to the internet, but aren’t thinking about whether they should.

“Could I attach my dog to the internet? Could I automate the process of ordering a taxi on my mobile phone? We’re obsessed with could we problems. That’s how we live our lives and careers, we invent things and we solve problems. We’re good at ‘Could we’,” he said, also speaking at Cloud Expo Europe.

No matter the reason why things are being connected to the internet, Thomson agrees with Hyppönen about what the end goal is: data collection.

“The connectivity of those devices is impressive and important. But what’s more important is how that’s coming to bear across various markets. Every single sector on the planet is in a race to digitise, to connect things. And very importantly, to collect data from those things,” he says.

However, various incidents have demonstrated how the Internet of Things is rife with security vulnerabilities as vendors put profit and speed to market before anything else, with cybersecurity very low down the list of priorities.

Retrofitting updates via the use of patches might work for a PC, a laptop or even a smartphone, but there are huge swathes of devices — and even whole internet-connected industrial or urban facilities — for which being shut down in order to install an update is impossible.

“The security industry to date is predicated on the benefit of the retrofit. IT has designed insecure systems then we’ve secured them. That’s kind of OK in a world where a device can have some downtime,” says Thomson.

“But a car, a building, a city, a pipeline, a nuclear power facility can’t tolerate downtime. So if we don’t build security and privacy in to our designs from the very first whiteboard, we’re going to leave ourselves with a problem.”

Not only that, but as IoT devices become more and more common, people will start to ignore them.

“The reality of the human mind is as we embed things, we tend to forget about them, we get complacent about them. Many of you are probably wearing a smart device on your wrist to monitor your behaviour and exercise routines. But no doubt two weeks after you started wearing it, you forgot it was there,” he says.

“The danger from a psychological perspective is that people forget about that technology and forget about the risks associated with it and our own personal mitigation of that risk.”

Even now, consumers are too blasé about connected devices, keen to jump on the latest technological trends while failing to realise the associated security risks. Even when they do, they remain unclear on how to secure their IoT devices — that is, if there is the option of securing them in the first place.

“Nobody reads the manual, especially to page 85 where it says how to change the default credentials, or page 90 where it says how to set up user accounts and restrict access to the admin interface, or page 100 where it says how to segment your network,” says Hyppönen.

He likens it to the “exact same problem we had in the 80s” when people wouldn’t even bother to set a time on their video recorder as it involved picking up the manual, so it’d end up always flashing 12:00.
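Hyppönen’s unread-manual point is why botnets such as Mirai succeed: they simply try a short list of factory credentials. A defensive audit of a device inventory can apply the same trick; the device names and passwords below are invented examples:

```python
# Hypothetical inventory audit: flag devices still using factory
# credentials. Device names and credential pairs are invented.
FACTORY_DEFAULTS = {("admin", "admin"), ("admin", "1234"), ("root", "root")}

inventory = [
    {"device": "camera-01", "user": "admin", "password": "admin"},
    {"device": "router-01", "user": "admin", "password": "x7!pQv92"},
    {"device": "dvr-01",    "user": "root",  "password": "root"},
]

def audit(devices):
    """Return the names of devices still on factory credentials."""
    return [d["device"] for d in devices
            if (d["user"], d["password"]) in FACTORY_DEFAULTS]

print(audit(inventory))  # ['camera-01', 'dvr-01']
```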

It’s therefore important for the Internet of Things cybersecurity loopholes to be shut sooner rather than later so as to avoid nightmare scenarios where hackers could exploit vulnerabilities to attack anything from pacemakers and other medical devices, to connected cars to even entire industrial facilities.

But are IoT device manufacturers going to do this anytime soon? Probably not.

“The manufacturers of IoT devices are unlikely to fix this by themselves. They’re unlikely to start investing more money in their IoT devices for security because money is the most important thing in home appliances,” says Hyppönen.

“When you buy a washing machine, price is the most important selling point. Nobody’s asking, ‘does it have a firewall or intrusion prevention systems?’ Cybersecurity isn’t a selling point for a washing machine, so why would manufacturers invest money in it?” he adds.

It might eventually be regulation which has to fix this problem; as Hyppönen points out, device safety is already regulated. “When you buy a washing machine, it must not short circuit and catch fire, we regulate that. Maybe we should regulate security,” he says.


The internet of botnets and ransomware on your TV: Here come your next big security headaches

National Cyber Security Centre and National Crime Agency warn more must be done to secure critical services from the threat of IoT hacks.

Cyberattacks exploiting the insecurity of the Internet of Things, and hackers attempting to compromise industrial connected devices, are among the biggest threats to the UK, those responsible for ensuring national security have warned.

Citing incidents including the internet crippling Mirai botnet cyberattack and vulnerabilities in a children’s doll which could potentially be exploited to conduct espionage on unsuspecting victims, a new report by the intelligence services has warned that the rise of IoT devices is providing threat actors with more opportunities to attack targets than ever before.

The joint report from the National Cyber Security Centre (NCSC) and the National Crime Agency (NCA), titled The cyber threat to UK business, details the growing threats to individuals and organisations from cyberattacks.

Noting how many IoT devices are shipped with security flaws that leave them vulnerable to remote takeover — and without means to update or otherwise fix the devices — the report warns of the increased threat of IoT botnet attacks and says this form of cyberattack is going to get more frequent and more damaging in future.

If attackers continue to turn their efforts towards attacking industrial connected devices, then it could have potentially devastating consequences. In a worst case, hackers could turn off infrastructure such as electricity, water, or heating by hijacking or overwhelming insecure IoT devices.

The NCSC/NCA report warns that “sufficient safeguards are still not in place to protect these systems that were never designed to connect to the internet”, which could ultimately result in damaging real-world consequences.

The National Cyber Security Centre cites a cyberattack in Finland, where a DDoS attack disabled residential automated heating systems in apartment blocks for more than a week.

The report also warns of the increasing threat posed by ransomware, which has risen to become one of the biggest threats on the internet.

Citing ransomware-as-a-service schemes on the dark web which allow almost anyone to become a cybercriminal, the report warns how ransomware allows “individuals and groups to have an impact disproportionate to their technical skill”, especially as those carrying out the attacks are increasingly targeting businesses.

Cybercriminals are already targeting smartphones with ransomware, but the report warns that 2017 will see hackers attempt to lock down other types of connected devices, including fitness trackers and TVs.

While the information stored on these is unlikely to be worth much for anyone looking to sell it on the digital underground, the report predicts that “the device and data will be sufficiently valuable to the victim that they will be willing to pay for it”.

“Cyberattacks will continue to evolve, which is why the country must work together at pace to deliver hard outcomes and ground-breaking innovation to reduce the cyber threat to critical services and deter would-be attackers,” said Ciaran Martin, CEO of the NCSC, speaking ahead of the agency’s CYBER UK conference in Liverpool.

Nonetheless, the NCSC — part of the GCHQ intelligence service — believes that IoT security is likely to “eventually” improve, but the government needs to play a role in ensuring these devices are secured.

“Government also has a part to play in promoting smart device security and helping to develop standards such as the NCSC’s and the Department for Business, Energy & Industrial Strategy’s work to ensure the Smart Metering System has proportionate security measures in place,” says the report.

However, the threat is set to loom large for the immediate future, thanks to the millions of insecure smart devices which are already connected to the internet — especially as millions more will be connected in the years to come, the report warns.

“Malware authors will continue to exploit them to mount attacks and will continue working to find fresh vulnerabilities. The ‘botnet of things’ will present a serious challenge to cybersecurity for a considerable time to come,” the report says.

The release of the NCSC/NCA report comes shortly after tech industry body the Online Trust Alliance (OTA) issued a rallying cry for vendors, retailers, and users to act together to “avoid digital disaster” caused by insecure IoT devices.


Cisco: First-gen internet is ‘not fit for purpose’ for IoT

Connecting things and machines will require a ‘re-engineering’ of the architecture of the internet to transform it into its second generation, Cisco’s CTO for ANZ has said, with edge computing required to address latency.

The internet as it currently stands is “not fit for purpose” for connecting things and machines, according to Cisco ANZ CTO Kevin Bloch, as its mobility-era assumptions will be upended by the Internet of Things (IoT) and a complete re-engineering will need to take place.

“As we move to more machines being connected than humans, the first generation of the internet is not fit for purpose for the second generation, which is going to be calibrated far more by the Internet of Things,” Bloch told ZDNet.

Calling 4G, 4.5G, and 5G “the same damn thing”, he said humans will still want the same kind of connectivity, albeit faster, cheaper, and with more data — but on the other hand, IoT requires the exact opposite of current mobile technology connectivity in every way.

“When you’re looking at the Internet of Things, you’re actually doing the reverse of that: In many cases, you’ve got no power, you can’t charge batteries for 10 years; in many cases, you’re far away from the transmitter-receiver, like on a farm, so you need to think lower frequencies not higher frequencies,” he said.

“So when we start talking about the internet gen-one being not fit for purpose, that’s what we’re talking about. There’s a need to rethink the engineering end to end of the internet in order to support things.”

Cisco is in a “really strong position” for this re-engineering project, Bloch said, with the latency requirements of IoT also requiring edge computing during the shift to second-generation internet — which means cloud will also change.

“In terms of that re-engineering, it’s not just the connectivity piece; it’s also latency,” he explained.

“You need to have actually intelligence distributed at the edge so that you’re circumventing the latency problem. And this is why I’m saying it’s the end of cloud as we know it, because you’re going to be pushing intelligence to that edge, and that edge over time is going to look a lot more powerful. It’s going to be looking like an edged cloud.”

Cisco SVP of Enterprise Infrastructure and Solutions Jeff Reed also addressed the importance of edge computing for IoT during his Cisco Live Melbourne technology keynote on Thursday morning.

“The edge is so critical as part of this evolution. And what I’m seeing is that in the IoT world … it’s not just connectivity. It’s not just networking. It’s how I think about security at the edge. It’s how I think about compute at the edge,” Reed said.

“We make these applications for IoT that require processing close to the things due to latency, due to bandwidth requirements or lack thereof. So as you kind of embark upon this IoT world, the services and capabilities at the edge of your network are going to be more and more critical.”
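The bandwidth half of Reed’s argument can be shown concretely: summarise raw readings at the edge and ship only the summary upstream. The numbers and field names here are illustrative, not from Cisco:

```python
# Sketch of edge processing: reduce raw sensor samples to a compact
# summary plus any alarms before sending upstream. All values are
# invented for illustration.
readings = [21.0, 21.2, 20.9, 35.5, 21.1, 21.0]  # e.g. one minute of temps

def edge_summarise(samples, alarm_threshold=30.0):
    """Reduce raw samples to one small message: count, mean, alarms."""
    return {
        "count": len(samples),
        "mean": round(sum(samples) / len(samples), 2),
        "alarms": [s for s in samples if s > alarm_threshold],
    }

summary = edge_summarise(readings)
print(summary)  # one small message upstream instead of six raw samples
```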

IoT will also require a fundamental shift in the basic mobile business model, with Bloch repeating his warning of the possibility that telcos may not make money through IoT.

“Be careful if you’re a telco. We might be saying there’s going to be 50 billion things connected … but they’re going to expect connectivity for free,” Bloch told ZDNet, pointing towards the low cost of devices and connectivity that are required in such IoT products as the CSIRO’s RFID bee backpacks project.

Bloch pointed towards the fact that Cisco has IoT partnerships in both unlicensed and licensed spectrum with NNN Co and Telstra, respectively, because vendors need to look at “the right application for the right use case”.

However, he said the future may lie in unlicensed spectrum.

“We’ll be using things like free spectrum, and again, telcos need to think about that, because at the end of the day, how you monetise the Internet of Things, it’s not the same as if we look back in time how they monetised mobile phones,” he said.

“That model’s there for humans, but it’s not going to cut it for the machine world.”

Disclosure: Corinne Reichert travelled to Cisco Live in Melbourne as a guest of Cisco.


The Internet of Weaponized Things: Time to tackle device security before it’s too late

Vendors, retailers, and users need to act together to “avoid digital disaster” caused by insecure IoT devices, warns a tech security group.

Failing to tackle the insecurity of the Internet of Things and connected devices could lead to the technology’s weaponization, resulting in irreversible consequences for us all.

Incidents including the Mirai IoT botnet cyberattack and stuffed toys being found to leak unsecured personal information onto the open internet have demonstrated how security around the Internet of Things is still immature, despite how these devices are proliferating in our homes and workplaces.

In order to prevent the long-term consequences of insecure IoT devices resulting in property damage, theft, or even physical harm to people, tech industry body the Online Trust Alliance (OTA) has called on the technology sector, businesses and industry, the government and consumers to come together to “avoid digital disaster”.

The call comes in the OTA’s new report, Securing the Internet of Things: A Collaborative & Shared Responsibility, which warns how “too many IoT devices appear to be designed primarily for convenience and functionality while long-term security is conspicuously absent”.

The report comes shortly after the World Economic Forum warned about the potential threats to society from IoT hacking.

Likening the risk to global warming or industrial pollution, the OTA warns that there will be long-term consequences resulting from failure to deal with IoT threats and that lack of action has already “created a treasure chest ripe for abuse by white collar criminals, terrorists, and state-sponsored actors as IoT devices become weaponized”.

“All too many connected devices sold, ranging from automobiles and thermostats to children’s toys and fitness devices, have insecure remote access and controls. By default many collect vast amounts of personal and sensitive information which may be shared and traded on the open market,” warns the report, which notes how many of these devices don’t have the functionality to remove personal data if they are sold or lent out.

As voice-enabled devices like Amazon Echo and Google Home take off, a lack of sufficient user authentication on these assistants could be exploited, the report warns, as demonstrated by incidents where home assistants have bought items after hearing instructions from children or even voices on television.

If this sort of risk isn’t curbed, the OTA suggests that we could reach the point where people could issue commands to devices in the home or workplace by yelling through a window or leaving a message on an answering machine. If they ask devices to unlock the doors, outsiders could potentially walk right in.

“It does not take much imagination to realize the risk and impact of physical harm which could occur,” the report warns.

In order to combat this risk, the OTA calls for both the public and private sectors to work together to ensure security is built into Internet of Things devices as “all stakeholders bear a responsibility”. This includes retailers, developers, ISPs, regulators, government, and consumers.

The OTA argues that the IoT shows a lot of promise, but in order to protect users, action is needed now to “maximize the security, privacy and vitality of all IoT devices”.

“Acting now will help prevent and mitigate the risk of a digital disaster. We all have a role and responsibility to address security and privacy,” the report said.

It said retailers and resellers should help “in setting baseline security and privacy measures for the products they profit from”. Meanwhile developers and manufacturers should disclose their security support commitment to users prior to purchase and “clearly articulate their security and privacy policies”.

In addition, sellers of homes and cars should be encouraged to disclose all such devices and features, disable their access, and provide new owners the ability to re-set them, turn in their physical and digital keys, and remove all personal data.


Cloud-to-client, direct: serverless computing reduces the middle

‘Hybrid cloud isn’t going to be a mix of AWS and Google, or AWS and on-premise. It will be a mix of AWS and client machines.’

One of the buzzwords to emerge over the past year is that of “serverless” computing or architecture, which, as the term suggests, involves the provisioning of key information technology resources to users without the fuss and muss of acquiring and activating additional hardware, which not only means servers, but disk space as well. Let the cloud vendors worry about the messy details of protocols, security, resource provisioning, processor speeds, and memory allocation, and focus on the applications business users need to run their organizations.

Serverless is, for all intents and purposes, another name for Platform as a Service. Vendor tools and environments suited to the purpose include Amazon Web Services Lambda, IBM BlueMix OpenWhisk, and Microsoft Azure Functions. Buzzwords aside, full-throttle adoption of serverless platforms may even prompt a rethinking of optimal hybrid cloud architectures, and of what it means for IT teams to serve as brokers of needed business services.
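The programming model across these platforms is broadly similar: you supply a function, and the platform invokes it per request, handling provisioning and scaling. A minimal sketch of an AWS Lambda-style handler in Python (the event fields and response shape here are illustrative, not tied to any specific application):

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler: the platform passes in the request
    payload (event) and a runtime context; no server is provisioned or
    managed by the developer."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

The platform bills per invocation and scales the function horizontally on demand, which is what makes the pay-for-what-you-use economics of the model possible.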

That’s the experience of Gojko Adzic, a highly regarded thought leader in the IT space and partner at Neuri Consulting, who recently explored his journey down the serverless computing path with his MindMup project. In his post, Adzic provides some food for thought as to the best way to structure the delivery of cloud-based back-end services to a dynamic user base.

MindMup, which offers mind mapping tools, first piloted the AWS Lambda platform in February 2016, and moved entirely over to Lambda at the beginning of 2017, Adzic relates. The site has seen positive outcomes so far in its one-year journey, with a user base increasing by 50% while hosting costs have dropped 50%, he says. Plus, scaling to meet demand is now relatively painless.

Serverless computing is about a platform approach, not just services. Organizations expecting that simply moving existing applications to serverless platforms will save money will be disappointed, Adzic says: even when an infrastructure is redeployed across the resources of a cloud provider such as AWS, it may involve paying several times over for the work of connecting web requests. “By far, the biggest lesson for me was to really embrace the platform, not just the service,” he relates.

Adopting a platform approach can be accomplished in three ways, Adzic explains: through distributed authorization; by letting clients orchestrate workflows; or by allowing clients to connect directly to AWS resources. MindMup went with the third, direct-to-client approach, as the first two options have limitations within AWS environments. Enabling direct access to platform services has helped to reduce both latency and costs.
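The direct-to-client idea can be illustrated with a toy, standard-library-only sketch (the names and scheme are hypothetical, not MindMup's actual code): the back end performs one small privileged step, minting a scoped, expiring token, and the client then talks to the storage resource directly with that token. Conceptually, this is what an S3 pre-signed URL does.

```python
import hashlib
import hmac
import time

SECRET = b"server-side-signing-key"  # never leaves the back end

def mint_upload_token(object_key: str, ttl_seconds: int = 300) -> str:
    """Privileged back-end step: sign a scoped, expiring grant for one
    object, which the client can then present to storage directly."""
    expires = int(time.time()) + ttl_seconds
    payload = f"{object_key}:{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def storage_accepts(token: str) -> bool:
    """What the storage service checks when the client connects
    directly: a valid signature and an unexpired grant. No application
    server sits in the request path."""
    object_key, expires, sig = token.rsplit(":", 2)
    payload = f"{object_key}:{expires}"
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and int(expires) > time.time()
```

Only the token-minting step needs to run on locked-down infrastructure; every upload and download afterwards flows straight between client and storage, which is where the latency and cost savings come from.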

As Adzic observes, this model of direct cloud-to-client architecture represents the most expedient way to deliver hybrid services, and he goes on to suggest that this may even change the way enterprises think about hybrid cloud architecture. That is, open up back-end services directly to clients and client applications, rather than structuring layers of services between users and cloud functions:

“When client applications can directly connect to ‘back-end’ resources, there’s very little benefit orchestrating that from anywhere else. Coordination, workflows and many other aspects of an application can move directly to the client application. Only the parts that really need to be locked down for security reasons or to use specialist resources need to go to AWS. The hybrid-cloud of the future isn’t going to be a mix of AWS and Google, or AWS and on-premise. It will be a mix of AWS and client machines.”


Ancient equipment and poor networking: BlackBerry weighs in on the state of medical device security

BlackBerry executive Nader Henein says that before security weaknesses can be addressed, healthcare providers need to take a long, hard look at their networks and rebuild them from the ground up.

BlackBerry executive Nader Henein believes that as more Internet of Things (IoT) and connected healthcare devices come into play, the industry must take steps to reevaluate its security — from the ground up.

The use of IoT and connected devices can provide healthcare providers and patients with a range of benefits — such as the use of mobile technology to track conditions, improved communication between departments, and personalized medical care — but it also comes with risk.

The problem lies in how medical devices are connected to the internet and the path this opens for cyberattackers to exploit. Not only could attackers target victims individually, as demonstrated by IOActive researcher Barnaby Jack’s experiments with pacemakers, but patient data can be stolen, and medical devices left exposed online give attackers the chance to cause havoc if they wish.

Despite these issues, business is booming, with analysts predicting the global IoT healthcare market will be worth as much as $410 billion by 2022.

Speaking to ZDNet, Nader Henein, regional director of advanced security assurance advisory at BlackBerry, said that security standards for biomedical devices, unlike PCs or mobile products, are still in their infancy.

While some patch programs do exist and some medical equipment makers — such as Hospira — are beginning to take responsibility for security, industry standards must be created to ensure at least a basic level of security for future devices.

However, the lack of standards is not the only problem.

“The focus is still almost entirely on the function of the device rather than its capacity to be secured,” Henein says. “As such, the problem in the medical space is the same as with any traditionally isolated device that has become connected over the past few years: these devices are often ‘insecure’ and, worse, in many cases ‘un-securable.’”

The true security of a medical device needs to be measured by its capability to withstand a cyberattack from skilled hackers. That measure, in turn, needs to evolve over time, in keeping with the nature of cybercrime, the executive says, and devices should be constantly tested against the latest industry threats.

While the US Food and Drug Administration has only reached the “recommendations” stage when it comes to medical device security, there is at least some independent industry movement. In May 2016, DTSEC was released: a medical device cybersecurity standard created and managed by a BlackBerry-led non-profit consortium.

The standard focuses on embedded medical device security through systems implemented at the beginning of development cycles. Drawing on other international standards, including ISO 15408 and IEC 62304, DTSEC acts as a guide containing security requirements and recommendations for different product types.

See also: FDA one of many ‘toothless dragons’ with no will to tackle medical device security

It is the responsibility of both medical equipment vendors and hospitals to take note of how the cybersecurity landscape is evolving and what threats may be landing at their door. Budgets are understandably tight and overstretched, but unless such entities want to entertain the risk of cyberattackers causing harm to a patient, investment needs to begin at the device level and extend to network security, update schedules, and staff training to detect malicious threats.

“When improving cyber defenses in the healthcare industry, one of the first steps must be to properly re-engineer the network where they sit so that insecure devices are not a threat and their usage can be properly monitored,” the executive says. “In the long term, updating or replacing insecure devices should be part of the change management process.”

Read on: Medical device ‘birth certificates’ could solve healthcare security woes

According to the executive, not enough is being done by healthcare providers to secure their networks — especially when such investments hit the bottom line.

As the healthcare industry is trained and focused on providing patient care, research, and medical equipment, security is often left by the wayside — and there may be little left over out of budgets to shore up networks and protect devices connected to them.

“The issue is, if they want access to the latest and greatest in lifesaving medical equipment, they’re going to have to also focus on cybersecurity as almost all of these devices are connected,” Henein noted. “As healthcare providers learn more about the benefits of IoT, they will also become more aware of the associated threats. In turn, healthcare budgets should start to increase to reflect this increasing awareness over time, but this is not going to happen overnight.”


System Requirements

Both the OsMonitor server and client run on Windows XP, Windows Server 2003/2008/2012/2016, Windows 7, Windows 8/8.1, and Windows 10, in both 32-bit and 64-bit editions.

Customer Review

We are now using your monitoring software, OsMonitor. It is great software: we are able to block non-business websites, monitor the activities of our users, see the websites they visit, and even capture snapshots. It covers the majority of our needs.