Network computer monitoring

Monthly archives for April, 2017

Meet Chris Vickery, the internet’s data breach hunter

His job is simple: Find leaked and exposed data before the bad guys do.

NEW YORK — It’s a phone call you hope never comes in: Chris Vickery has found your company’s entire set of customer data on the
internet.

He sits at his desk, littered with external hard drives storing terabytes of data, in his home office in Santa Rosa, Calif., where
he scours the internet for data that shouldn’t be accessible — a phone number, a social security number, or credit card data —
sitting in databases that aren’t password-protected for anyone to access.

Using search engines for internet-connected devices, like Shodan, and tools that scan common ports where data typically live,
Vickery can tick off hundreds of internet addresses and their ports for leaky databases, badly configured backup drives, and other
inappropriately stored data.
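The port-scanning half of that workflow can be sketched in a few lines of Python: a plain TCP connect check against the default ports of popular data stores. This is an illustrative assumption, not Vickery's actual tooling, and the port list and function names are invented for the example; services like Shodan perform this kind of check at internet scale.

```python
import socket

# Common default ports for data stores that are frequently found exposed
# (27017: MongoDB, 9200: Elasticsearch, 6379: Redis, 873: rsync).
DATASTORE_PORTS = [27017, 9200, 6379, 873]

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def scan_host(host: str) -> list[int]:
    """Return the subset of DATASTORE_PORTS that accept connections on host."""
    return [p for p in DATASTORE_PORTS if is_port_open(host, p)]
```

An open port is only the first signal; whether the service behind it actually serves data without authentication is a separate check.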

It’s a race to find accidentally exposed data before the bad guys do.

But it’s a time-consuming and technical job that requires focus, patience, and the temperament to accept failure and to know
when to call it a day.

Like others in the security research space, it also requires working strictly ethically and within the bounds of the law. When
Vickery finds an exposed database, he goes through a process of responsible disclosure — usually as simple as privately informing
the company of its mistake — in the hope that it can seal the leak before a criminal can steal the data.
Only when the data is safe does he blog about his findings so that readers can learn from others’ mistakes. “It’s kinda like a
treasure hunt,” he told me on the phone last week.

Vickery, a softly spoken southerner, isn’t driven by money or reputation — though the latter has become an occupational hazard of
his blogging.

Through his blog, Vickery has sparked more headlines in recent years than almost any other security researcher, and yet he isn’t a
household name. His work has helped protect the personal information and privacy of tens of millions of people.

In the past few years, Vickery has found sensitive data from hotel chains, a massive financial crime and terrorism database,
several breaches of health data, leaked data from a dating app for HIV-positive people, a publicly stored trove of voter
registrations for 93 million Mexicans, a law firm’s files that cast doubt on the official report into an inmate’s death, and a
leaky airport server that stored highly sensitive TSA files — to name just a few.

His work for the past couple of years has been associated with Kromtech, the maker of MacKeeper, a some-might-say controversial
utility for Apple desktops that has been fraught with complaints and concerns — which the company has rebuffed — in part
because of its perceived pushy advertising tactics and aggressive affiliates.

It’s fitting that it was a data breach that brought him to the company, after he found 13 million accounts in its unprotected
database.

As of Monday, Vickery started a new full-time role at UpGuard, a cybersecurity startup that last year raised $17 million in
financing on the strength of its core product, a cybersecurity grading system.

The Mountain View, Calif.-based company’s flagship product is a credit-style score for cybersecurity, which determines a company’s
cyber-risk factors by scanning its internal network and systems and spitting out a report on where it can improve. UpGuard also has
a free web-based tool that lets anyone run a scan on any company’s external network (such as a website and subdomains) to measure
its security posture.

The company’s co-founder and co-chief executive, Mike Baukes, said on the phone last week that Vickery’s name “kept coming up” in
the discovery of data breaches.

“Our capability isn’t just about developing products that helps fix issues that Chris finds,” said Baukes. “It’s also about
elevating these issues to the right places and raising the industry’s awareness,” he said, arguing that many cybersecurity products
have an “inability to translate the issues properly” and leave “people in the dark” about what they need to do next.

“We share a similar belief system,” said Baukes, calling Vickery’s work “deeply honorable.”

Vickery’s work began back in his native Austin, Texas, during his former day job as an IT technician at a law firm. What started
as an academic curiosity about security and data protection slowly evolved into a full-time passion.

One small data exposure led to another, and during those formative early days he came to recognize that there were huge swathes
of exposed data out there if you knew where to look.

He jumped down the rabbit hole of data breach discovery and hasn’t looked back.

Now, Vickery is seen by many — reporters and fellow security researchers alike — as the master of the internet’s lost-and-found
department. He’s driven by a desire to return this leaked and misplaced information to its rightful owner. Guided by a strict set
of mostly self-imposed moral guidelines that dictate how he works, his process from discovery to disclosure relies almost entirely
on reaching out in good faith to the unwitting companies that — often through carelessness — have leaked the information their
customers trusted them with, and he asks them to come clean.

“If the companies that I inform respond well and fix things and don’t just ignore me and think I’m trying to take advantage of them
somehow. And if they do notify the affected people, secure it quickly, and are open about it — and they’re not trying to demonize
me — that’s a good day,” he said.

“A lot of the time those elements don’t come together,” he explained.

But not everyone appreciates what he does.

Few want to be told that they have committed a basic but catastrophic security error. All too often, though, Vickery’s
good-samaritan act is met with hostility — or worse, he’s used as a scapegoat when companies seek to shift the blame
to the work of a “hacker.”

“It’s extremely frustrating when companies don’t take responsibility for breaches,” he said. “But it’s a natural human response for
some — a knee-jerk response,” he said, to blame the person who found the data rather than their own shoddy security.

Vickery is not a hacker, but the law covering security research and breach discovery is far from simple, thanks to the
antiquated Computer Fraud and Abuse Act (CFAA) — persistently criticized as a barrier to security research for its
overbroad terms and definitions.

Where the law requires hackers to gain “unauthorized access” to a server to fall foul of it, such as by using or cracking a
password that keeps everyone else out, the data that Vickery finds is never protected in the first place.

Arguably, his discoveries are no different from how ordinary internet users browse the web.

“Browsing is requesting files from a directory on a web server and displaying them onto your screen. Every time you visit Amazon,
you’re downloading files from Amazon’s servers. That’s exactly what I’m doing,” he said.

“If what I’m doing is illegal, then browsing any web page is illegal,” he said.

The CFAA has been ridiculed and scoffed at. The law, for instance, makes it illegal to share your Netflix login with someone else
— or even your social media account — effectively putting the social media team of any leading brand at risk of violating federal
hacking laws.

Congress has tried to fix the law but to no avail, and it remains a serious threat to security researchers and their work.

But just last month, Vickery was named in a lawsuit brought by River City Media, a company, accused of being a top spammer, that
exposed its own systems by failing to use a password on a backup drive. The lawsuit accuses Vickery of being a “vigilante black-hat
hacker,” though no government agency has ever brought charges of its own.

“They have made up a lot of things I’m certain they can’t prove,” he said in response to the complaint. “Certain people will always
try and defer blame,” he said. “What is a profit-minded corporate guy going to do — potentially give up millions of dollars in
fines or say that this one guy hacked me? It’s a clear decision on their side. The best leaders and companies will accept
responsibility in a situation — but bad businesses, they tend to focus on ‘shooting the messenger’.”

I asked whether the lawsuit, if successful, could have a chilling effect on security research — or even for reporters, like
myself, who cover data breaches, leaks, and exposures.

“If they can make up and fabricate events and have a jury believe them — well that’s going to have a far greater effect than
chilling researchers and data breach reporting,” he said.

“That means the entire system is broken,” he added.

It doesn’t seem that Vickery will back out of this line of work anytime soon. He’s a man on a mission, and he admits that his
hectic schedule far exceeds the nine-to-five confines of most corporate jobs. It’s something he loves — and a necessity for the
next wave of Americans whose data he wants to try to protect.

But it’s a hostile world, and he, like the rest of the security community, faces the persistent threat of undue hostility from the
corporate world, absent a landmark decision that would, in his words, change the face of computer law enforcement. And the River
City Media case could, if it escalates, put Vickery at the forefront of that legal change, for better or for worse. It makes you
wonder why someone would put themselves in the line of legal fire.

“Somebody has to do it,” he said. “And I feel a duty to carry on doing what I do.”

From:http://www.zdnet.com/article/chris-vickery-data-breach-hunter/

Open-source EdgeX Foundry seeks to standardize Internet of Things

Fifty companies have joined up to unify Internet of Things edge-computing programming.

Security is the Internet of Things’ (IoT) Achilles heel. One reason is the lack of common IoT development standards. The Linux Foundation, along with 50 companies, is addressing this by building a common open framework for IoT edge computing and an ecosystem of interoperable components under a new open-source consortium: the EdgeX Foundry.

The new initiative has a common goal: The simplification and standardization of Industrial IoT edge computing, while still allowing room for vendors to add their own value-add features.

True, IoT is already booming as a business, but widespread fragmentation and the lack of a common IoT solution framework are hindering its broad adoption and stalling market growth. In addition, crooks are already breaking into IoT devices with cracking tools such as the Metasploit hacking kit.

This complexity and IoT’s wide variety of components are creating paralysis. EdgeX will attempt to solve this by making it easy to quickly create IoT edge solutions that have the flexibility to adapt to changing business needs.

The EdgeX Foundry will try to unify the marketplace around a common open framework and build an ecosystem of companies offering interoperable plug-and-play components. These will be designed to run on any hardware or operating system and with any combination of application environments. This flexibility will also help EdgeX deliver interoperability between connected devices, applications, and services across a wide range of use cases. A certification program will ensure interoperability between community-developed programs.

That’s easier said than done, but the initial work is already in place. Dell is seeding EdgeX Foundry with its early stage FUSE source code base under Apache 2.0. FUSE forms a layer that will sit between the many different messaging protocols used by today’s sensor networks and the cloud and server layers.

The contribution comprises more than a dozen microservices and over 125,000 lines of code. It’s designed to facilitate interoperability between existing connectivity standards and commercial value-adds such as edge analytics, security, system management, and services. Other EdgeX members are already adding code.

They’re doing this because, Philip DesAutels, The Linux Foundation’s Senior Director of IoT, explained, “Businesses currently have to invest a lot of time and energy into developing their own edge computing solutions, before they can even deploy IoT solutions to address business challenges. EdgeX will foster an ecosystem of interoperable components from a variety of vendors, so that resources can be spent on driving business value instead of combining and integrating IoT components.”

This may sound to you a bit like AllJoyn. It’s not. AllJoyn is an open-source protocol for device-to-device communication. EdgeX is a framework for building IoT edge software and firmware that connects via the internet to the cloud.

The EdgeX members are adopting an open-source edge software platform because it will help everyone in the IoT business world.

End customers can deploy IoT edge solutions quickly and easily with the flexibility to dynamically adapt to changing business needs.
Hardware Manufacturers can scale faster with an interoperable partner ecosystem and more robust security and system management.
Independent Software Vendors can enjoy interoperability with third-party applications and hardware without reinventing connectivity.
Sensor/Device Makers can write an application-level device driver with a selected protocol once using the SDK and get pull from all solution providers.
System Integrators can get to market faster with plug-and-play ingredients combined with their own proprietary inventions.
EdgeX’s membership includes numerous IoT movers and shakers. Founding members include: Advanced Micro Devices (AMD), Bayshore Networks, Canonical, Dell, Linaro, NetFoundry, and VMware. Industry affiliate members include: Cloud Foundry Foundation, EnOcean Alliance, Mainflux, Object Management Group, Project Haystack and ULE Alliance.

I hope it’s successful. The current mish-mash of incompatible, one-off technologies is a recipe for insecurity. An interoperable, open-source-based IoT world will be far safer and better for both vendors and consumers.

From:http://www.zdnet.com/article/open-source-edgex-foundry-seeks-to-standardize-internet-of-things/

Snowflake Computing raises $100 million to expand cloud data warehouse footprint

Snowflake Computing, a cloud data warehouse player led by former Microsoft exec Bob Muglia, raised money to expand its engineering team and European footprint. We talked shop with Muglia.

Snowflake Computing, a cloud data warehouse vendor, has closed a $100 million funding round as it aims to expand internationally.

The company, built on Amazon Web Services, was founded in 2012 and has raised $205 million so far. The funding round was led by ICONIQ Capital and Madrona Venture Group and included its initial venture partners.

Snowflake said it will build out its engineering team as it aims to take on more on-premises data warehouse workloads. CEO Bob Muglia, a former Microsoft exec, said the funding will help Snowflake scale. I caught up with Muglia to talk shop. Among the key themes:

Building on Amazon Web Services. Snowflake’s cloud data warehouse service is built on AWS, which also competes via Redshift and other offerings. “AWS has treated us well. We compete with them obviously, but they are great partners,” said Muglia. Translation: Netflix is built on AWS and competes with Amazon. Snowflake is a data warehouse spin on that theme.

Also: Will Snowflake spark a cloud data warehouse price war?

Where are we in the cloud data warehouse adoption curve? Muglia said data warehouses in the cloud are still in the early adoption phase internationally, but have hit an inflection point in the US. Part of Snowflake’s latest funding round will be devoted to growing in Europe, which has data sovereignty laws. “In Europe there is a lot of interest,” said Muglia. When Snowflake first started, the company sold to early adopters in the media, ad tech and entertainment industry. “We were competing against Redshift primarily because it was the early leader,” explained Muglia. “But the world has changed and mainstream companies are moving toward cloud adoption and they are looking for solutions that work with structured, unstructured, and transactional business data.”

As a result, Snowflake is increasingly competing with Oracle, IBM’s Netezza, and Teradata, said Muglia. Snowflake’s architecture is built to manage traditional data warehouse workloads as well as unstructured data. “We built from the ground up for the cloud,” said Muglia.

Typically, a customer has an existing data warehouse and starts with Snowflake and either uses the cloud for new workloads or eventually migrates.

Gartner puts Snowflake into the niche market in its data management Magic Quadrant.

Hadoop. Snowflake is often landing customers that have become disillusioned with Hadoop. “Hadoop is an easy target because no one is happy and it’s hard to make Hadoop work. It’s a science project, basically, that’s a multi-month, multi-year solution with custom SI (systems integrator) work,” said Muglia. “If you’re Netflix you can make Hadoop work, but there are maybe 10 to 20 companies that are proficient.”

The Internet of things. Muglia said that IoT creates a lot of “semi-structured” data and is complicated to manage with a structured approach. That situation is good for Snowflake. The next version of LTE will also create a surge in IoT data. Snowflake is engaging the public sector for smart city and other municipal deployments.

From:http://www.zdnet.com/article/snowflake-computing-raises-100-million-to-expand-cloud-data-warehouse-footprint/

Microsoft ‘Project Sopris’ takes aim at securing low-cost IoT devices

A new Microsoft Research team, Project Sopris, is looking to redesign microcontrollers in the name of making low-cost IoT devices more secure.

Microsoft researchers are working on a new project aimed at trying to secure low-cost Internet of Things (IoT) devices.

The Project Sopris team is “exploring the goal of securing the vast number of low-cost internet-connected devices coming online,” says the research page for the Sopris project, which was officially established March 31, 2017.

“As part of this research work, we have tested different approaches to device security from silicon to software and hypothesize that optimal device security must be rooted in hardware but kept up-to-date through evolving software,” explain the researchers.

Among those working on Sopris are partner research manager Galen Hunt; principal researcher Ed Nightingale; and senior hardware architect George Letey. As unearthed by “The Walking Cat” (@h0x0d on Twitter), senior director of silicon and system architecture Rob Shearer also seems to be on the team.

Hunt has been a key member of a number of previous significant Microsoft OS research projects, including Singularity, Drawbridge, and Menlo.

The Sopris team has published its first technical report, titled “The Seven Properties of Highly Secure Devices.”

That paper notes that the Sopris researchers are paying special attention to the “tens of billions of devices powered by microcontrollers,” as they are not prepared for the security challenges posed by internet connectivity.

The Sopris team is working with silicon partner MediaTek to revise one of its controllers — the Wi-Fi-enabled MT7687 — to create a prototype of a highly secure microcontroller.

Microsoft is looking to have security researchers test the Sopris security kit via the Project Sopris Challenge. The application period for the challenge closes April 14. Microsoft is offering bounties from $2,500 to $15,000 for submissions of eligible security vulnerabilities found in its early research prototype.

Early findings indicate that “even the most price-sensitive devices should be redesigned to achieve the high levels of device security critical to society’s safety,” the researchers say.

From:http://www.zdnet.com/article/microsoft-project-sopris-takes-aim-at-securing-low-cost-iot-devices/

Building my own Internet of Things ambient experience, one step at a time

How a lightbulb can become another way of getting the information you want, when you want it.

First, a confession. I always wanted one of those internet-connected rabbits.

You know the ones: they’d glow yellow if it was sunny, drop an ear if the NASDAQ had conniptions, and make burbling noises when a friend was trying to Skype you and you’d turned off all your speakers.

They were an early example of an ambient user interface, using the things around us to tell us about the things we can’t see. A stock market movement waving a rabbit ear might seem odd at first, but it’s a simple signal that encodes complex information. Catching that movement out of the corner of your eye gives you the opportunity to delve into richer tools and get the information that triggered the ambient action.

Buying an off-the-shelf rabbit puts you at risk of losing service, as all those devices eventually did. So is there a more open alternative, one that’s less self-consciously cute, and one that’s ready for the modern API-driven world?

As I’ve noted in other posts, I’ve been experimenting with retro-fitting smart devices into a Victorian London house. It’s a challenge: the building isn’t designed for technology beyond a gas lamp and a coal fire, so anything I fit has to be wirelessly connected to my office network.

A recent switch to a set of Netgear Orbi wireless mesh access points has made that a lot easier, with fewer, more powerful wireless routers looking like a single node to my devices. Orbi’s smarter approach to wireless networking means it’s also a better neighbor to some of the low-power Wi-Fi devices that make up some of my personal ‘intranet of things’.

I’ve already written about the Arlo and Ring cameras over the front door, and they’ve recently been joined by a Ring Video Doorbell. Between them they’ve helped me monitor a package that had been left on the doorstep while I was on the other side of London, and provide entertaining wildlife footage of the local urban foxes’ 2 am antics (see my video below).

They’re only part of my home’s sensor network: a Nest Protect smoke detector also provides CO monitoring, while a Netatmo home weather station collects data about temperature, rain, and wind from the roof of the house. It also provides tools for monitoring local pollutant levels, a useful feature for living in a city like London.

However, the trouble with all these devices is that each is very much its own thing, with its own app on my phone and its own tab in my browser. So how can I bring them together, and how can I see what they’re telling me without having to look at my phone?

The answer was inspired by that wireless rabbit. What if I built some sort of ambient information device on my desk that could quickly show me what was going on in my world, giving me the facts I need when I need them?

Philips had recently sent me a set of their Hue bulbs — colour-changing LED bulbs with a wireless hub that connected them to my home network. I put one in an old desk lamp, in the corner of my crowded desk, and started looking at how I could connect it to a range of different services.

While the Hue hub has a local set of RESTful APIs, it’s been designed so that only a limited number of trusted partners get access to Philips’ cloud-hosted Hue service. One option, then, was to use a Raspberry Pi running Node-RED as a personal IoT hub, linking various cloud-hosted APIs and webhooks to a freely available module that could control my lights. It was certainly an attractive option, but was there a simpler way?
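For reference, the Hue bridge's local REST API (version 1) drives a bulb with a simple PUT to /api/{username}/lights/{id}/state. A minimal Python sketch follows; the bridge address, username, and colour values are placeholder assumptions for a hypothetical setup:

```python
import json
from urllib import request

def hue_state_request(bridge_ip: str, username: str, light_id: int,
                      on: bool = True, hue: int = 46920,
                      sat: int = 254, bri: int = 200):
    """Build the URL and JSON body for a Hue v1 'set light state' call.

    hue runs 0-65535 around the colour wheel (46920 is roughly blue);
    sat (saturation) and bri (brightness) run 0-254.
    """
    url = f"http://{bridge_ip}/api/{username}/lights/{light_id}/state"
    body = json.dumps({"on": on, "hue": hue, "sat": sat, "bri": bri})
    return url, body

def set_light(bridge_ip: str, username: str, light_id: int, **state):
    """PUT the state change to a real bridge on the local network."""
    url, body = hue_state_request(bridge_ip, username, light_id, **state)
    req = request.Request(url, data=body.encode("utf-8"), method="PUT")
    with request.urlopen(req, timeout=5) as resp:
        return json.load(resp)
```

The username is an authorised API key created by pressing the bridge's link button and registering against its /api endpoint; without it, the bridge rejects the request.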

Luckily there was. One of Philips’ trusted Hue partners is the personal no-code API connection service IFTTT. If This Then That uses simple API-driven triggers to connect one service to another, and you can build a personal library of services that won’t run until triggered.

With Ring and Netatmo as IFTTT inputs, and with Hue as an output, I could target that desk lamp, changing its colour in response to external events. Ring my doorbell, and the light goes blue. Is it raining? The light is green. That gives me a very simple set of tools for driving my lights from my various IoT devices.

IFTTT is useful, but it’s limited. Its no-code approach limits you to known triggers and known outputs. You can’t make decisions based on the content of input data, and you’re limited to services where IFTTT has existing partnerships.

However, the answer to the problem is hidden away in the ever-growing list of services that IFTTT supports: its Maker Webhooks. Intended for users building their own IoT hardware, IFTTT’s webhooks use a common web standard to give you a set of open triggers that are tied to your IFTTT account. All you need to do is create a webhook and copy its URL with your personal authentication key. You’ll notice that the URL has an {event} field. This is where you define the trigger your IFTTT applet is going to respond to. You’ll also see there’s the option of delivering a JSON payload from whatever triggers your webhook.
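The Maker Webhooks trigger URL has a well-known shape, https://maker.ifttt.com/trigger/{event}/with/key/{key}, and accepts an optional JSON payload of value1..value3 fields. A minimal sketch in Python (the event name and key below are placeholders, not real credentials):

```python
import json
from urllib import request

def trigger_url(event: str, key: str) -> str:
    """Build the Maker Webhooks URL; {event} is the applet trigger name."""
    return f"https://maker.ifttt.com/trigger/{event}/with/key/{key}"

def fire_event(event: str, key: str, value1=None, value2=None, value3=None):
    """POST the webhook, attaching only the payload fields that were given."""
    payload = {k: v for k, v in
               [("value1", value1), ("value2", value2), ("value3", value3)]
               if v is not None}
    req = request.Request(
        trigger_url(event, key),
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=10) as resp:  # needs a real key to succeed
        return resp.read().decode("utf-8")
```

So, for example, fire_event("doorbell_rang", MY_KEY, value1="front door") would fire any applet listening for a hypothetical doorbell_rang event.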

With support for multiple {event}s, you’re able to build applets that connect to your Hue for a range of different possible outcomes, tying them to colours, to patterns, or just to blink the light. Things get interesting though when you take advantage of the payload, as now you can send a colour from another application to your Hue bulb.

That made me wonder: could I connect my IFTTT applet to an output from a more information oriented service like Microsoft’s Flow?

The answer was, unsurprisingly, yes. Flow is able to send signals to a webhook, so all I needed to do was configure the appropriate URL for my IFTTT Maker Webhook, and hook my Flow output to an input. Flow lets you build much more complex interactions than IFTTT, with tools for handling basic conditional workflows and stacking outputs, so one input can have multiple outcomes. It’s also focused on working with enterprise data sources, plugging in to Microsoft’s and other cloud services.

Now, while I could have used a Flow to deliver a colour that encoded sales data from Dynamics, or my number of unread messages in Outlook from Office 365, I actually just used Flow’s click button to send a signal to IFTTT that would turn the light on.

It was simple, it was relatively uncomplicated, and, best of all, it worked.

I’d connected two different no-code services together, taking advantage of their strengths to do something that one alone couldn’t do. I was also a long way on my road to building an ambient information system out of my desk lamp. All I need now is a list of my inputs and I’ll be ready to go.

From:http://www.zdnet.com/article/building-my-own-internet-of-things-ambient-experience-one-step-at-a-time/

The dangers of the public internet

There are three types of attacks: ones that attack the confidentiality of data, ones that attack the integrity of data and ones that attack the availability of systems

The Internet is under attack.

It has been for many years, ever since “hacker” and “malware” first crept into our vocabulary. But the internet has grown exponentially since those days. It was never meant to handle the volume of data it carries today, a volume that exceeded 1 zettabyte last year.

The internet was originally built just to share files between users. The fact that it has grown into the massive web of data and endpoints we enjoy now — one where smartphones, tablets and smart TVs will account for nearly 70 percent of Internet traffic by 2019 — is an enormous convenience to how we work, communicate and live.

But, that evolution has been enormously convenient to cyberattackers as well, whose methods for breaching, infecting, and stealing data and bringing down networks have now turned the internet and its billions of users into a digital dartboard, constantly under assault.

The CIA of data

Data breaches, DDoS (distributed denial of service) attacks, brute force decryption attacks — these are just some of the more prominent examples of cyber and network security attacks we’ve seen grow in scale and frequency over the years. These cyberattacks come in three distinct types: ones that attack the confidentiality of data, ones that attack the integrity of data and ones that attack the availability of systems — CIA, for short.

Target, Sony, OPM, the IRS, and Snapchat are just a handful of recent C and I attacks that have made some of the biggest headlines, striking at the confidentiality or integrity of millions of personal records. But, the A attacks, like a DDoS attack that takes aim at service uptime, are perhaps even bigger threats, bringing networks and cloud platforms offline for extended periods of time, disrupting business continuity, and even holding network availability and data for ransom, at an average cost of $620,000 for enterprises. Even just the threat of a DDoS attack can wreak havoc.

You don’t even need to be an intended target of these attackers to feel their impact. When a DDoS attack travels down the same line that your traffic uses, it disrupts or shuts down your service all the same; your traffic becomes collateral damage.

Internet apathy

The common thread through these CIA attacks is the public internet. As long as enterprises and end users are using the public internet for transmitting and storing data, they will always be putting themselves at risk.

Enterprises, SaaS providers and cloud services don’t have to operate over the public internet, but they choose to because it has always made sense economically. The alternative has always been costlier and more complex.

But, why then, do they still accept all of those risks that are becoming more serious every year? Because the public internet is “good enough.” It works most of the time already, and when you’ve been used to it for 25 years, why rock the boat?

Enterprises and SaaS providers don’t start to reevaluate their reliance on the public internet until they have a bad experience — a network breach, a disrupted conference call, something that negatively impacts a business-critical service. By then, it’s too late, assuming they even treat it as a wake-up call at all.

What’s especially troubling is that these same enterprises are now relying on the cloud for their mission-critical apps. That means moving some of their most critical processes to a place that can only be accessed through a pipe that is notoriously unreliable and unsafe.

That’s just bad business sense. But, because these problems only occur intermittently, they’re not thought of as serious concerns ahead of time. More than that, many enterprises simply don’t have the network savvy to properly diagnose why they were attacked in the first place, and how the public internet itself is ultimately culpable.

Looking outside the box

The public internet is the root problem, and any solution that ignores that will only be attacking the symptoms, not the cause. That’s why enterprises and their partners need to look outside of the box — in this case, the internet itself — to find their way out from the dangers of the public Internet that will only become more dangerous as time goes on.

Solutions that can provide private, reliable connections outside of the internet, such as interconnections and VPNs, give organizations a new way to network without having to worry about their data being suddenly impeded, stolen, or shut down by an attacker. Until enterprises start thinking about how to move their operations and data traffic off the public pathways of the internet, they will find themselves under the constant threat of the next CIA attack waiting just around the corner.

From: http://www.infoworld.com/article/3172730/security/the-dangers-of-the-public-internet.html

For internet privacy, a VPN won’t save you

In theory, getting a VPN is good advice. But the technology hasn’t caught up to modern standards yet, and some “safe” services could put you at greater risk.

Last week, Congress voted to gut proposed internet privacy rules set out by the outgoing Obama administration that would have prevented your internet provider from selling your browser history to advertisers. President Donald Trump signed the bill a day after, making it law.

Many turned to what appeared to be an obvious solution: A virtual private network (VPN).

The idea of using a VPN is simple enough. The good ones are designed to push your internet traffic through a protected and secured tunnel, which shields your browsing records — such as the websites you view — from your internet provider. (As a result, some VPNs push your internet traffic through servers in other countries to trick content providers, like Netflix, into thinking you’re in a different place — usually in order to gain access to content in other geographies.)

But VPNs are, for the most part, lousy: often over capacity, and almost always a significant drag on your internet speeds. And sometimes services simply won’t work or load because they can detect you’re using a VPN, forcing you to jump off it, which defeats the point of using the service on a long-term basis.

And, a lot of the time, the bad ones won’t protect your privacy as they promise.

Some services are better than others. We’re not here to tell you the best ones or pick sides, but there are some pointers to note from our sister-site CNET and here on ZDNet. For example, paid services are usually better at hiding your traffic than free services where the customer is usually the product.

But what compounds the problem is that some phony VPN services promise to protect your privacy, but they don’t and are simply cashing in on the news, said Motherboard.

The big question to ask yourself is: Should I trust this VPN provider? More often than not, you can’t and shouldn’t.

Why? Not least because VPN providers don’t always encrypt your web traffic; some don’t run their own domain name servers, which means your internet provider can still see the websites you’re accessing; and some that do run their own servers host them in other countries, leaving you beholden to those countries’ laws. As security researcher Troy Hunt said in a recent blog post, because VPN providers control your traffic, “they can inspect it, modify it, log it, and have a very good idea of what it is you’re up to.”

As security reporter Brian Krebs notes, many VPN providers “claim they keep zero records of customer activity,” but “this is almost always untrue if you take the time to read the fine print.”

Often, the reality is that you end up paying for a VPN service that you have to trust, even more than your internet provider, not to collect, monitor, or sell your data.

As famed security sensation Swift On Security said in a recent tweet:

It’s not to say that there aren’t good VPN providers out there, but you have to weigh up the reasons why you want to protect your browsing history and other data.

When push comes to shove, there are better ways to protect your browsing data than using a VPN.

And while nothing is perfect, and the web will never be completely secure (nothing ever is), you’re better off taking advantage of plugins like HTTPS Everywhere, which upgrades your connection to the secure version of a page wherever one is available.

Amazon, Google, Facebook, Twitter — and yes, even Pornhub and YouPorn, your favorite online adult destinations, all offer HTTPS by default, which masks the page and its content (albeit not the domain) from internet-browsing snoopers.
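To make that “masks the page but not the domain” point concrete, here is a minimal Python sketch of which parts of an HTTPS URL an on-path snooper can and cannot see (the URL and the function are illustrative, not from any real tool):

```python
from urllib.parse import urlparse

def https_visibility(url: str) -> dict:
    """Split a URL into what a network snooper sees over HTTPS versus what
    stays encrypted. The domain leaks via DNS lookups and the TLS SNI field;
    the path, query string, and page content ride inside the encrypted
    session."""
    p = urlparse(url)
    return {
        "visible": p.hostname,
        "encrypted": p.path + (f"?{p.query}" if p.query else ""),
    }

print(https_visibility("https://example.com/account/history?year=2017"))
```

Run it against any URL: the hostname always lands in the “visible” bucket, while everything after it stays inside the encrypted session.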

And when all else fails, your internet provider isn’t going to be able to monitor your activity on the Tor anonymity network any time soon.

And widespread adoption of VPN services by average internet users is never going to happen, nor should it. Many will inherently choose convenience, ease, and faster speeds over security and slowdowns, defeating the point altogether.

Given that this entire saga started with Congress voting to scrap internet privacy rules, the question to ask isn’t “how do I protect my internet history,” it’s “how do we get out of this mess?”

From: http://www.zdnet.com/article/for-internet-privacy-a-vpn-will-not-save-you/

How to use a VPN to protect your internet privacy

A virtual private network can go a long way to make sure that neither your ISP, nor anyone else, can snoop on what you do on the internet.

Worried about your ISP? Is someone on your coffee shop’s Wi-Fi? Or is Joe A Hacker bugging your internet? A virtual private network (VPN) can help protect your privacy.

A VPN uses encryption technologies, such as IP security (IPSec), Layer 2 Tunneling Protocol (L2TP)/IPSec, and Secure Sockets Layer (SSL) and Transport Layer Security (TLS), to create a virtual encrypted “tunnel” between your device and a VPN server. While your traffic is in this tunnel between you and a VPN server, no one can see where you’re going or what you’re doing.
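The tunnel idea above can be sketched conceptually in a few lines of Python. This is a toy model, not real cryptography: the XOR step merely stands in for IPsec or TLS encryption, and `vpn.example.net` and the packet format are invented for illustration.

```python
import secrets

def seal(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for real tunnel encryption (IPsec/TLS).
    Never use XOR like this for anything real."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encapsulate(inner_dest: str, payload: bytes, key: bytes) -> bytes:
    """Wrap a 'packet' for the tunnel: the real destination and payload are
    sealed, and only the VPN server's address stays readable on the wire."""
    inner = inner_dest.encode() + b"|" + payload
    return b"to:vpn.example.net|" + seal(inner, key)

def decapsulate(packet: bytes, key: bytes) -> tuple[str, bytes]:
    """What the VPN server does: unseal and recover the real destination."""
    sealed = packet.split(b"|", 1)[1]
    dest, payload = seal(sealed, key).split(b"|", 1)  # XOR is symmetric
    return dest.decode(), payload

key = secrets.token_bytes(32)
pkt = encapsulate("news-site.example", b"GET /politics", key)
print(pkt[:19])               # all an on-path observer sees: the VPN server
print(decapsulate(pkt, key))  # the server recovers destination and payload
```

An observer between you and the VPN server sees only the outer `to:vpn.example.net` header; the real destination and payload sit in the sealed portion, which only the VPN server can open.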

Besides protecting your privacy, VPN services are also commonly used for BitTorrent and other Peer-to-Peer (P2P) traffic since many ISPs frown on file-sharing. People also use VPNs to watch streaming video services, such as Netflix and Hulu, in areas where they aren’t legally available. In recent years, the streaming services have taken steps to prevent VPNs from carrying their traffic.

While you can set up your own VPN server, such as OpenVPN on Ubuntu, that’s too much work for most people. Your employer may offer VPN services for remote users; if so, ask if you can use it from home.

VPN services

For most people the answer is to use a VPN service. These companies enable you to create a VPN between your gadgets and their internet connection. Once your connection is on the other side of their VPN server, your traffic emerges without signs of who you are or where you’re connecting from.

There’s one fundamental concern with VPN services: Can you trust them not to track you? Some VPNs keep their own records of where you go on the net. If privacy is a real concern for you, check your VPN’s terms and policies to see if they keep logs of your online activities. If they do, look for another VPN.

Some VPNs are far shadier than just logging your visits. In 2015, the free VPN service Hola was found to be selling its users’ bandwidth to its Luminati service’s paying customers.

You should also know before subscribing to a VPN service that you can be almost certain your internet speed will decline. That’s because you’re going to be sharing the VPN’s broadband connection with other users. As always, an internet connection is only as fast as its slowest link.
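The slowest-link point is simple arithmetic: whatever bandwidth your ISP sells you, a congested VPN hop caps your effective throughput. A toy illustration (the hop names and numbers are invented):

```python
# Throughput over a chained path is bounded by its slowest hop.
hops_mbps = {
    "home uplink": 100,   # what your ISP sells you
    "VPN server": 40,     # shared with every other subscriber on the node
    "destination": 250,
}
effective = min(hops_mbps.values())
print(f"effective throughput: {effective} Mbps")
```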

The great majority of VPN services require you to install an application on your device. Many, but not all of them, also support Android and iOS, so you can secure your mobile traffic.

Most VPN services charge for their services. After all, a VPN provider must, at a bare minimum, pay for its own network equipment and broadband. Nonetheless, there are some decent free VPN services.

Some, such as Spotflux and Hotspot Shield, do this by placing ads in your stream. Others, including Steganos Online Shield and TunnelBear, will give you a free tunnel for a limited amount of traffic. These two offer 500 Megabytes of bandwidth per month. Of the free services, I prefer Spotflux, but if you have minimal bandwidth needs, TunnelBear is also worthwhile.

As for the paid services, what you want is one with lots of bandwidth and multiple sites. Before subscribing to any of these services, try them out first. Many of them offer free trials, and it’s worth taking them up on this. VPN performance varies wildly — not just from company to company but from place to place. If you live near a VPN endpoint that’s constantly overloaded you won’t be happy, even if your brother across the country is getting great performance with the same VPN service.

That said, I’ve been using VPNs for over a decade and I’ve used many of them. The ones that have worked best for me are Banana VPN, NordVPN, Private Internet Access VPN, StrongVPN, and ZenMate.

Their prices vary. Generally speaking, the longer term you sign up for, such as a year paid in advance, the cheaper the subscription fee. This typically drops the price below $10 a month. But, as I mentioned, try the service first before getting locked into a long-term contract.

Installing a VPN tends to be mindlessly simple. The Opera web browser even comes with a built-in VPN these days.

Beyond VPNs

There are also other services that look like VPNs, one being the free web proxy. A web proxy is a server that acts as a middleman between you and a website. It sounds good, but about 75 percent of all free web proxies have recently been shown to be untrustworthy. If you’re already using one and want to know whether it’s OK, you can test it with ProxyCheck.

Another popular privacy solution is Tor, a software-and-network pairing that hides your identity by routing your traffic across a series of Tor servers and encrypting it along the way. However, there’s reason to believe that Tor isn’t as secure as its reputation suggests. The Justice Department recently dropped a case because it didn’t want to reveal how it had cracked Tor.

Your ISP isn’t going to be cracking your Tor connection. For most people the real problem with Tor is that its connections tend to be very slow.

Do you really need to worry about any of this? I think you do. Sure, the major ISPs claim they’re not going to spy on you, but I don’t believe them. Even before the government decided to let ISPs sell your browsing history, the big ISPs had a track record of playing fast and loose with your privacy.

For most people, the best internet privacy solution is a good, fast VPN provider you can trust. Give the ones I suggested a try. I’m sure you’ll find one you like.

From: http://www.zdnet.com/article/how-to-use-a-vpn-to-protect-your-internet-privacy/

System Requirements

Both OsMonitor Server and Client work on Windows 2000, Windows XP, Windows Server 2003/2008/2012, Windows Server 2012 R2, Windows Vista, Windows 7, Windows 8/8.1, and Windows 10, in both 32-bit and 64-bit editions.

Customer Review

We are now using your monitoring software, OsMonitor. It is great software: we are able to block non-business websites, monitor our users’ activities and the websites they visit, and even take screen snapshots. The majority of our needs are met by your software.