Just Given Up Your Eldest Child For Free WiFi? Perhaps I Can Interest You In This Tech Preview?

‘Ello guv’nor, I heard you sold your kid for WiFi. Perhaps I could interest you in another good deal? It’s called tech for privacy and I know you’re gonna luv it.

Having decided to skip the logical next number, Microsoft has announced that its new operating system will be called Windows 10.

In a move many see as an attempt to put the memory of the not-so-popular Windows 8 behind it, the company is going full steam ahead as it marches toward the inevitable retail release of its replacement.

In the meantime, however, early adopters can grab a technical preview to see how Redmond has accommodated Start button-loving fans of its arguably much better Windows 7.

Being one of the first people to get your hands on a new operating system may sound pretty cool but that will only be the case if you read the privacy policy first (something you should always do before installing new software).

Why?

Because Microsoft sharing the tech preview with you is a reciprocal agreement which sees your data travel back in the opposite direction.

Specifically, the Windows Insider Programme policy says,

“Microsoft collects information about you, your devices, applications and networks, and your use of those devices, applications and networks. Examples of data we collect include your name, email address, preferences and interests; browsing, search and file history; phone call and SMS data; device configuration and sensor data; and application usage.”

While the sheer volume of collectible data is staggering, and far beyond what I for one would be happy to give up if I had a choice, it is standard fare these days, more’s the pity.

More disconcerting, though, are the two following entries:

“We may collect information about your device and applications and use it for purposes such as determining or improving compatibility” and “[when you] use voice input features like speech-to-text, we may collect voice information and use it for purposes such as improving speech processing.”

and

“If you open a file, we may collect information about the file, the application used to open the file, and how long it takes, and use [that data] for purposes such as improving performance, or [if you] enter text, we may collect typed characters and use them for purposes such as improving autocomplete and spellcheck features.”

Did that sink in?

If not, read it again and you will see that signing up for the Windows 10 preview means giving Microsoft permission both to record your voice, and specifically what you say, and to collect everything you type on your keyboard.

In other words, you will be voluntarily installing a voice recorder and a keylogger onto any system running this version of Windows.

Ouch!

There is no word on whether the privacy policy will be similarly worded when bundled with the final version and I suspect, and hope, that it won’t – I’d like to think that Microsoft is merely gathering so much data to help it make improvements to the new operating system before its retail release.

But there are no guarantees of anything these days, especially where technology is concerned and, likewise it seems, in the realm of data gathering.

So my advice is to research Windows 10 thoroughly upon its general release and to check out its privacy policy in its entirety before letting it anywhere near any of your devices.

Alas, most people will not do so. After all, the latest tech is often so enticing that people will do the craziest things to get on the bandwagon.

Is It Worth Sacrificing Privacy For A Bit Of Geeky Self-Quackery?

Wearable tech. It’s all the rage, don’t you know.

From glassholes (not you Neira, you’re cool) to joggers with glorified digital watches, people everywhere are getting excited about the next big thing in what I would describe as self-eroding privacy.

Whilst Google Glass owners may be in short supply, possibly put off by the cost, the number of people owning health and fitness gizmos seems to be on the rise, aided and abetted by other cool-to-have devices such as the newly released iPhone chunky that can help tap into all that data.

In some ways I can see why the ability to monitor fitness metrics could be quite enticing, allowing users to set their own goals and to motivate themselves by stretching their targets or competing with others.

That said however, some performance measurements can lead to disappointment if you start getting into e-competition with other people who may have published their own results online, either intentionally or inadvertently (yes lads, two minutes of moderate exertion is pretty lame, or at least that’s what she said).

And that’s the problem you see – some health, wellness and fitness data should remain private from your family and even the lads or ladies down the pub. And I’m not just talking about the obvious faux pas linked to above either – other data really shouldn’t be common knowledge in my opinion, or at least not so common that it appears on the web.

Comparing heart rates and other metrics at the gym could be a good thing but sharing such data with a mechanism that is easily scoured and mined by who knows who is not so good, is it? I mean, would you want your insurance company to know that you are a 30-year-old with the fitness level of a pensioner? It’s ok, I know it’s not your fault, it’s all that sitting at a desk and the pizzas, well, they’re just too nice. But what would an underwriter think? Higher premiums perhaps? I don’t see why not.

After all, who are you sharing that data with? Do you even know? Has the app developer made it clear during the signup and installation routine? Did you even bother reading all that gumpf when you downloaded it?

Does the app developer have a social networking aspect where you can share and compare data? Who has access to what? Is the data made public such as in the example above where ‘performance’ data appeared in Google search results? Are data-storing websites secure? Does your smartwatch company sell your data to third parties or share it with them?

So many questions, all of which could have a huge impact on your privacy.

And just what benefits are you getting anyway?

Is your health improving? Will a wearable make you fitter? Surely self-motivation is key, not technology.

And what does your doctor make of all this data you are producing about your health? Not much, to be honest. In fact a new survey of physicians here in the UK highlights a potential problem with the new army of high-tech health buffs – many are self-diagnosing but they’re not very good at it.

In fact, less than 5% of doctors thought that health apps and websites offered any kind of value as patients start taking it upon themselves to figure out their own health and fitness routines or even research their own perceived medical conditions.

Heaven forbid that someone would take the advice of a watch over their GP but I guess it’s happening already and will only become more commonplace in the future.

In case you haven’t guessed already, I don’t like wearable tech. It’s too invasive by nature and the data it produces is arguably not secure or private enough by default, never mind should someone ever decide to target it. And its usefulness? For some people such devices could be invaluable in enhancing their training routines but then I would guess such people would probably do ok without them anyway. For everyone else? What do you think?

Facebook’s Exploding Posts: Mission Impossible vs. Robin Of Sherwood

This Facebook post will self-destruct in 5 seconds.

Well, ok, maybe not 5 seconds. But your latest Facebook post could soon be gone in a timescale chosen by you (well, ok, anywhere between 1 hour and a week).

A small number of users spotted the new feature, which allows users to set a deletion date at the time they create a new post, in Facebook’s iOS app earlier this week.

A Facebook spokesperson confirmed the existence of the trial feature, saying:

“We’re running a small pilot of a feature on Facebook for iOS that lets people schedule deletion of their posts in advance.”

Small scale trials of new features are nothing new for the social networking giant, which is constantly looking to evolve. Facebook users will be grateful, however, that this one is not as secret as, say, the recently surfaced emotion experiment, which tested how users react to positive and negative news and did little to enhance the reputation of a company that many fail to associate with privacy protection.

That said, Facebook may be learning what its users want, as evidenced by the recent addition of the ‘privacy dinosaur’ aka the new privacy checkup tool.

So, that means all users will be able to self-destruct all of their postings in the future, wiping them off Facebook’s servers for ever more, right?

Well, before you see Facebook as a means for posting questionable or sensitive content, consider that the answer to that question is not clear: removal from a user’s timeline looks to be permanent, but I’d be very surprised if Facebook wanted to let anything fall off its own servers (we know, for example, that it keeps a record of anything typed into the status box, regardless of whether the user subsequently decides to publish it).

Then of course there is the fact that virtually nothing shared on the web is private ever again anyway – the kids of today ain’t half bright you know and they can take screenshots and everything.

So before you even start contemplating using Facebook’s potential new service, or the Slingshot app, or Snapchat to post something you otherwise may have kept to yourself (or should have) remember that nothing that is published can be unpublished and privacy can sometimes be an illusion. Or, as my boyhood hero Michael Praed would say, “Nothing is forgotten. Nothing is ever forgotten.”

Microsoft Sings We’re Not Gonna Take It, Invites Contempt

“See you in court!”

Aaargggh, no thanks, that sounds like a mighty stressful and bank balance-busting exercise in futility to me.

But then again, I’m not Microsoft so perhaps I’ve got good reason to not want to end up in front of a judge. Not that I’ve done anything wrong of course. Honest. Just ask GCHQ – its minority report division already knows I’m a saint now and will continue to be so in the future too.

Microsoft, however, is so keen to have its say in court that it has invited proceedings upon itself. Kind of.

After US authorities made demands over emails stored on a Microsoft server in Dublin, Ireland, the software giant said no dice and has now taken the unusual step of asking to be held in contempt of court so that it can accelerate the privacy-based case to the appeals stage.

The case centres around a series of emails which are said to be relevant to an investigation into drug trafficking but, despite the potential gravity of that case, Microsoft disagrees with the government’s view that data held overseas is there to be grabbed, instead suggesting that US jurisdiction should terminate in line with its physical borders.

An outstanding warrant, about which almost nothing is known publicly, has caused Microsoft much consternation with the company promising to appeal any adverse ruling “promptly.” The company objected to the search on many levels, including the fact that it believes an existing precedent applies:

“The U.S. has entered into many bilateral agreements establishing specific procedures for obtaining physical evidence in another country including a recently-updated agreement with Ireland. We think the same procedures should apply in the online world.”

In a blog post, the company also highlights how it is taking the moral high ground in making a stand for privacy and also cites backers such as Apple, Cisco and the EFF.

None of this is to say that Microsoft feels it is above the law though, merely that it believes that government should play by the rules and follow established processes:

“We appreciate the vital importance of public safety, and we believe the government should be able to obtain evidence necessary to investigate a possible crime. We just believe the government should follow the processes it has established for obtaining physical evidence outside the United States.”

Now, after some procedural confusion, US District Judge Loretta Preska has found Microsoft in contempt, allowing the company to proceed with its appeal immediately. Meanwhile Microsoft has come to an agreement with the Department of Justice that allows it to escape punishment for that ruling, though the government said it retains the right to seek sanctions at a later date if it feels it necessary to do so, with the full stipulation saying:

  1. Microsoft has not fully complied with the warrant, and Microsoft does not intend to comply while it in good faith seeks further review of this Court’s July 31 decision rejecting Microsoft’s challenge to the Warrant.
  2. While Microsoft continues to believe that a contempt order is not required to perfect an appeal, it agrees that the entry of an order of contempt would eliminate any jurisdictional issues on appeal. Thus, while reserving its rights to appeal any contempt order and the underlying July 31 ruling, Microsoft concurs with the Government that entry of such an order will avoid delays and facilitate a prompt appeal in this case.
  3. The parties further agree that contempt sanctions need not be imposed at this time. The Government, however, reserves its right to seek sanctions, in addition to the contempt order, in the case of (a) materially changed circumstances in the underlying investigation, or (b) the Second Circuit’s issuance of the mandate in the appeal, if this Court’s order is affirmed and Microsoft continues not to comply with it.

Trading Privacy For Security In the Job Market

Personal data from Facebook, Twitter and other social media sites will be monitored more by employers over the next decade, according to a new report from PwC, which says that one third of young people would happily trade in their privacy in return for a little job security.

The future of work: A journey to 2022 report surveyed 10,000 workers around the world as well as 500 human resources professionals in order to gauge their attitudes towards their social media use being monitored by their employers.

The report suggests that data available through Facebook, Twitter and other social channels could be used by employers to gain an insight into what motivates their workforce along with other information including why staff change jobs and what could be done to improve their wellbeing within the organisation.

John Harding, human resource services partner at PwC in Manchester, said:

“Just as advertisers and retailers are using data from customers’ online and social media activity to tailor their shopping experience, organisations could soon start using workers’ personal data (with their permission) to measure and anticipate performance and retention issues.

This sort of data profiling could also extend to real-time monitoring of employees’ health, with proactive health guidance to help reduce sick leave. Key to the success of organisations being able to use employee data will be developing measurable benefits for those who hand over their data and building trust through clear rules about how data is acquired, used and shared.”

According to the research, half of the global workforce will be aged 32 or under by 2020, which will see a shift in attitude towards the use of technology and personal data. The PwC report says that these younger workers are far more relaxed about the sharing of data than previous generations, with 36% saying their employer is welcome to their personal data.

Whilst I can see why an employer would love to gain access to an employee’s social postings, either by viewing what is publicly available or via explicit consent, I struggle to see how the staff member gains from such an agreement.

By giving an employer permission to access their social media accounts, the individual would be giving up their privacy for very little return. The employer would gain all sorts of insight into how their staff think and what they do with their time when away from the workplace but I fail to see how that could be used to motivate them further, or increase their feeling of wellbeing. From the employees’ point of view I can see nothing to gain whatsoever. How giving up access to their social media accounts would lead to the claimed increase in job security I do not know.

This just seems to be another case of the general populace giving up their rights for very little in return. Or, as Benjamin Franklin might have said, “Those who surrender their social media accounts for job security will not have, nor do they deserve, either one.”

Considering the laid back attitude many youngsters have towards the sharing of their personal data these days I do wonder if, in the future, that approach will come back to bite them where it hurts.

Tesco Hudl – Every Little Data Reset Flaw DOESN’T Help

If you have some old tech you want to sell then eBay may be your first port of call. As much as I dislike the site and some of its practices, it still presents a means of putting unwanted goods in front of a huge number of eyeballs. But the problem is that it has generated a marketplace that appeals to a massive number of people, many of whom are not as security conscious as perhaps they could be.

I myself have bought a second-hand laptop in years gone by, only to discover that the previous owner had made absolutely no attempt whatsoever to clear their private data from the machine. I discovered his favourite websites (I hope he visited THAT site when his wife wasn’t around), I know who he banked with, I wasn’t partial to his taste in music, but I did agree strongly with the Liverpool FC background he left on it.

Ultimately, what I learned is that some people lack the security awareness, or are too lazy, to wipe their personal data from computers and other devices before disposing of them via an auction site or the local tip. Based upon hard drives I’ve been given by friends, it is a widespread problem which we can only hope to eradicate by raising the issue and educating people.

But sometimes education isn’t enough.

Take the Hudl tablet for example. Ken Munro of Pen Test Partners recently conducted an experiment, in conjunction with the BBC, in which he examined the data deletion systems on Android devices.

Purchasing second-hand Hudls from eBay, Munro discovered that even those previous owners who had wiped the device before shipping were at risk of having their confidential data accessed.

Munro found that the device retained information even after a factory reset due to a flaw in the Rockchip processor’s firmware. The known bug allowed him to read and write to the device using freely available software. Extracting information only took minutes but the analysis of the data typically took a couple of hours per machine. Once done, however, Munro was able to determine PIN codes, wi-fi keys, cookies and other browsing data that would have allowed him to spoof the original owner.

A Tesco spokesperson told the BBC that:

“Customers should always ensure all personal information is removed prior to giving away or selling any mobile device. To guarantee this, customers should use a data wipe program.”

The spokesperson went on to say that any Hudls returned to Tesco would be securely wiped by the company, but urged users to visit the Get Safe Online website if they have any further privacy-related concerns.

Marc Rogers, principal researcher at Lookout, explained further, saying that a secure wipe should be used before disposing of any data-storing device. Such a wipe will overwrite all onboard memory with ones and zeroes, rendering it useless to any third party that later tried to access it. Unfortunately though, most manufacturers have adopted a different approach to factory resets he said:

“There’s an Android function to wipe data and most manufacturers are using that. But all that does is remove the index of where data is and does not delete data at all.”
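The difference Rogers describes can be sketched in a few lines of Python. This is an illustrative sketch, not a production tool, and it assumes a single file on a conventional disk; flash storage with wear-levelling, as found in phones and tablets, can silently redirect in-place writes, which is exactly why dedicated wipe programs exist:

```python
import os

def secure_delete(path, passes=1):
    """Overwrite a file's contents in place before unlinking it.

    A plain os.remove() only drops the directory entry (the 'index'),
    leaving the bytes recoverable. This writes zeroes over them first.
    Illustrative only: wear-levelling on flash means an in-place
    overwrite is not guaranteed to hit the original cells.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)   # one pass of zeroes over the data
            f.flush()
            os.fsync(f.fileno())      # push the overwrite to the device
    os.remove(path)
```

The factory reset Rogers criticises does only the `os.remove()` step, which is why Munro could still recover PINs and wi-fi keys afterwards.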

Lookout also revealed that, according to police, the average underground price for a second-hand smartphone with personal data on it was around £600, which just goes to show the potential value of that data to the crook who ends up buying it.

As sales of smartphones and tablets increase, in part due to their convenience and portability, it is increasingly likely that owners will entrust more and more data to them. When those devices are subsequently sold on the selfies left in memory may provide the new owner with a few chuckles, but there is a chance that the banking data, credit card numbers and less than safe for work snaps may leave the original owner with something far more tangible than the thought of a stranger laughing at them.

So, if you are selling a Hudl, or any other device that has previously held your personal data, ensure that you wipe it securely before placing that listing.

Yahoo Set To Enable Email Encryption For All Users By 2015

In the same week that Google announced that it will give a search ranking boost to security-conscious websites, Yahoo has now revealed that it too will take a proactive stance on encryption.

The company announced at Black Hat that it will apply end-to-end encryption to its email services before the end of 2015.

The move is likely in response to the Edward Snowden revelations about government surveillance that have prompted many tech firms to assess their stance on privacy and encryption.

Thus far, Google has taken the biggest strides, with the aforementioned ranking change following previous announcements of support for end-to-end encryption in its Mail, Drive and Search products.

The change will likely be welcomed by Yahoo’s 273 million email account holders who had previously been left behind as other email providers adopted encryption.

Yahoo’s encryption will not hide details such as who has emailed who, or the contents of the subject line, but the contents of the message will be covered by a version of PGP encryption which has so far not been cracked.

In an interview with the Wall Street Journal, Yahoo chief information security officer Alex Stamos said:

“We have to make it clear to people it is not secret you’re emailing your priest. But the content of what you’re emailing him is secret.”
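Stamos’s distinction, a readable envelope around unreadable contents, can be illustrated with a toy Python sketch. To be clear about the assumptions: base64 here is merely a stand-in marking where real PGP ciphertext would go (it is an encoding, not encryption), and the addresses are invented:

```python
import base64
from email.message import EmailMessage

# Build a message the way a mail client would.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "priest@example.com"
msg["Subject"] = "Confession times"
body = "Can we meet on Sunday?"

# Stand-in for PGP: only the body is transformed, the headers are
# untouched. (base64 is an encoding, NOT encryption; it simply marks
# where the real ciphertext would sit.)
msg.set_content(base64.b64encode(body.encode()).decode())

# What an eavesdropper on the mail server still sees: who is mailing
# whom, and about what.
print(msg["From"], msg["To"], msg["Subject"])
assert body not in msg.get_content()  # but not what was said
```

Only the holder of the recipient’s key could reverse the real equivalent of that body transformation, which is the whole point of end-to-end encryption.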

PGP relies upon both the sender and receiver of an email having their own encryption key, which could potentially lead to problems similar to those experienced at Lavabit, which closed down after being forced to hand its keys over to the authorities.

Yahoo and Google, however, both claim that they will not hand keys over, not least because they are massive companies with the funds required to finance a large number of lawyers, with Stamos saying:

“That’s very different from a publicly traded multibillion dollar company with an army of lawyers who would love to take this argument all the way to the Supreme Court.”

Mark James, security specialist at ESET, welcomed the news but pointed out that the average man in the street may not understand how to take advantage of the change:

“It’s great that two of the largest internet email providers will be offering us the ability to send end-to-end encrypted emails to each other. After Google announcing it was doing the same thing a few months ago it is good to see another leading email provider following suit.

It won’t mean a lot to the average user but anyone who wants to protect their emails when using these providers will be able to do so by using these browser extensions.

So what does it actually mean? Well once the browser extension is added and configured you will be able to send an email with the contents completely scrambled to anyone except the sender and receiver. No one will be able to read the content. There are many encryption tools available for those that want to install and use them but for the average user they are often scary to set up. I for one welcome any type of “easy” security.”

I personally hope that Yahoo and Google do make their email encryption easily understandable by the less savvy web users out there, because we seemingly live in a society where having nothing to hide doesn’t mean no-one will go looking anyway.

Raspberry Pi, Wearables And The Low Cost Of Surveillance

Are you whiling the time away until you get your first smartwatch or preparing to run to the local store to buy the latest fitness tracker?

If so, you may wish to know that snoops can track such devices at a fraction of the price you will be paying for the latest in wearable tech.

New research from Symantec has shown that it is possible to track individuals, even in crowded places, via cheap and readily accessible hardware.

The security firm took a Raspberry Pi and added components including a Bluetooth 4.0 adapter, SD card and battery pack. All-in, the home-made tracker cost around $75, which is about £44 or €56.

The company took a number of such devices to busy public locations in both Switzerland and Ireland, as well as a major sporting event, and ran them in passive mode. By simply scanning the airwaves for signals broadcast by wearables, the RasPis were able to successfully track each and every one of them via their serial numbers or a combination of other factors, prompting the researchers to say:

“In our testing, we found that all the devices we encountered can be easily tracked using the unique hardware address that they transmit. Some devices (depending on configuration) may allow for remote querying, through which information such as the serial number or a combination of characteristics of the device can be discovered by a third party from a short distance away without making any physical contact with the device.”
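Nothing sophisticated is needed to turn those broadcasts into a movement history. Here is a hypothetical Python sketch (every address, time and location below is invented) of how logs from a handful of passive scanners could be stitched together:

```python
from collections import defaultdict

# (time, scanner_location, device_mac) tuples, as a passive Bluetooth
# LE scanner might log them. All values below are invented.
sightings = [
    ("09:00", "train station", "AA:BB:CC:DD:EE:01"),
    ("09:05", "train station", "AA:BB:CC:DD:EE:02"),
    ("09:40", "office lobby",  "AA:BB:CC:DD:EE:01"),
    ("12:30", "gym entrance",  "AA:BB:CC:DD:EE:01"),
]

# Because the wearable broadcasts the same unique hardware address
# everywhere, grouping sightings by that address reconstructs one
# wearer's day across every scanner that heard it.
tracks = defaultdict(list)
for when, where, mac in sightings:
    tracks[mac].append((when, where))

print(tracks["AA:BB:CC:DD:EE:01"])
# [('09:00', 'train station'), ('09:40', 'office lobby'), ('12:30', 'gym entrance')]
```

A fix would be randomising the broadcast address, which is precisely what the devices Symantec tested were not doing.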

The researchers also delved further into wearable tech and the associated apps, looking for other potential security and privacy concerns, and found several.

Symantec discovered that 52% of the self-tracking apps it examined did not have a privacy policy which, it says, may suggest that the developers do not take security and privacy as seriously as they perhaps could.

Researchers also discovered a large amount of unintentional data leakage with the average app contacting 5 domains (one even contacted 14 domains) in a short period of time. Whilst there may be legitimate reasons for a fitness or other tracking app to contact a number of domains for the transmission of data or to serve ads, for instance, Symantec said that the number of domains being contacted increased the risks of data leakage through human error, social engineering or careless or malicious handling of data.

The researchers also discovered other concerns, such as weak session management, which could allow session hijacking and, in turn, further problems.

Symantec’s blog post ends with the company pointing out that self-tracking apps and devices are not synonymous with privacy and suggesting that those who value their privacy will not get involved in self-tracking in the first place (I agree).

However, knowing that many users will continue to use fitness trackers, smartwatches, etc., regardless, the company offers up the following tips which I would describe as being little more than damage limitation rather than a security solution:

  • Use a screen lock or password to prevent unauthorized access to your device
  • Do not reuse the same user name and password between different sites
  • Use strong passwords
  • Turn off Bluetooth when not required
  • Be wary of sites and services asking for unnecessary or excessive information
  • Be careful when using social sharing features
  • Avoid sharing location details on social media
  • Avoid apps and services that do not prominently display a privacy policy
  • Read and understand the privacy policy of apps and services
  • Install app and operating system updates when available
  • Use a device-based security solution if available
  • Use full device encryption if available

If you would like more information on Symantec’s research a whitepaper can be found here.

Information Commissioner’s Office Reports On Big Data And Privacy

The Information Commissioner’s Office (ICO) has today released a new report that considers how big data will operate within existing data protection laws which ensure that personal information is:

  • Fairly and lawfully processed
  • Processed for limited purposes
  • Adequate, relevant and not excessive
  • Accurate and up to date
  • Not kept for longer than is necessary
  • Processed in line with your rights
  • Secure
  • Not transferred to other countries without adequate protection

The Big data and data protection report accepts that the use of big data can bring benefits to companies and doesn’t wish to stifle innovation. That said, the ICO is keen to point out that organisations still have an obligation to keep information both private and secure, offering the following practical advice for dealing with personal information used in big data analytics:

  • Personal data – Does your big data project need to use personal data at all? If you are using personal data, can it be anonymised? If you are processing personal data you have to comply with the Data Protection Act.
  • Privacy impact assessments – Carry out a privacy impact assessment to understand how the processing will affect the people concerned. Are you using personal data to identify general trends or to make decisions that affect individuals?
  • Repurposing data – If you are repurposing data, consider whether the new purpose is incompatible with the original purpose, in data protection terms, and whether you need to get consent. If you are buying in personal data from elsewhere, you need to practice due diligence and ensure that you have a data protection condition for your processing.
  • Data minimisation – Big data analytics is not an excuse for stockpiling data or keeping it longer than you need for your business purposes, just in case it might be useful. Long term uses must be articulated or justifiable, even if all the detail of the future use is not known.
  • Transparency – Be as transparent and open as possible about what you are doing. Explain the purposes, implications and benefits of the analytics. Think of innovative and effective ways to convey this to the people concerned.
  • Subject access – People have a right to see the data you are processing about them. Design systems that make it easy for you to collate this information. Think about enabling people to access their data online in a re-usable format.
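On the anonymisation point above, a common first step is pseudonymisation: replacing direct identifiers with a keyed hash before the data reaches the analytics store. The following minimal Python sketch is my illustration (the field names and the choice of HMAC-SHA-256 are assumptions, not anything the ICO prescribes), and note that pseudonymised data can still be personal data under the Act if re-identification remains possible:

```python
import hashlib
import hmac

# The key must be stored separately from the data; whoever holds it
# can link tokens back to people, so guard and rotate it.
SECRET_KEY = b"rotate-and-guard-this-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed hash.

    The same input always maps to the same token, so trends can still
    be analysed across records, but without the key the original
    value cannot be recovered from the token alone.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"email": "jane@example.com", "visits": 17}
safe_record = {"user": pseudonymise(record["email"]), "visits": record["visits"]}
print(safe_record)  # the email address itself never enters the analytics store
```

This supports the data minimisation advice too: the analytics side keeps only what it needs (a stable token and the metric), not the identifier itself.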

The ICO’s head of policy delivery, Steve Wood, says that there is a buzz around how big data can be used for social benefits as well as the more obvious economic advantages it can provide. He did, however, highlight how organisations are struggling to understand how they can put big data to innovative new uses without falling foul of the law. Wood also explained that individuals are expressing concern over how their personal data is being used in big data scenarios.

The answer, he says, begins with organisations being more transparent about how they are using big data:

“What we’re saying in this report is that many of the challenges of compliance can be overcome by being open about what you’re doing. Organisations need to think of innovative ways to tell customers what they want to do and what they’re hoping to achieve.

Not only does that go a long way toward complying with the law, but there are benefits from being seen as responsible custodians of data.”

The ICO report says that openness is a key factor, pointing out how organisations need to ensure that personal information is only used in ways previously communicated to users. The complexity of big data, it says, should not be used as an excuse to use data without consent.

Responding to concerns that existing data protection law is insufficient in the face of big data, Wood added that:

“Big data can work within the established data protection principles. The basic data protection principles already established in UK and EU law are flexible enough to cover big data. Applying those principles involves asking all the questions that anyone undertaking big data ought to be asking. Big data is not a game that is played by different rules.
The principles are still fit for purpose but organisations need to innovate when applying them.”

The organisation notes how the area of big data is fast-evolving, leading it to conclude that its guidance will likely change over time. In light of that, the ICO positively encourages feedback which can be sent to [email protected] up until September 12 of this year.

Are We The Architects Of Our Own Insecurity?

It’s a well-known fact that men are obsessed with something. (Note to self: make that two things but don’t mention the first).

Go to any shopping centre on a Saturday and you’ll notice all manner of sideways glances, secret peeks and longing stares as men of all ages centre their attention on anything but their significant others.

The object of their desire, of course, is technology. Like bees around a honey pot, we can’t help ourselves – new tech captivates us in ways we cannot explain and creates a longing and desire that nothing else can satisfy.

Boredom with the old and interest in the new is fed by some sort of crazy attention deficit that is ingrained into our very DNA, I swear.

Technology manufacturers love it though. Such interest, which is almost always backed up by demand where funds permit, drives them to create new products like there is no tomorrow.

But the never-ending rush to bring new ‘toys’ to market does have drawbacks.

The biggest one that I can see is that the security issues surrounding new technology never seem to be given the attention they deserve ahead of a product release; instead they are only considered later, in response to particular incidents or third-party research (think IoT, for instance).

Additionally, as we now know thanks to Edward Snowden, some governments have their own agendas when it comes to technology, seeing computers, phones and tablets as an extension to their national surveillance campaigns.

Some nations are not standing for it though, as evidenced by China’s claims on Friday that the iPhone represents a security threat to the state. The national TV broadcaster criticised the iPhone’s “Frequent Locations” function, saying that access to the data “could glean sensitive information such as the country’s economic situation or ‘even state secrets.'”

Apple hit back by saying that it “does not track users’ locations – Apple has never done so and has no plans to ever do so,” adding that it had “never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will. It’s something we feel very strongly about.”

Whilst the iPhone is still freely available for sale in the country at this time I would not be surprised if that changes very soon, given the fact that China also moved quickly to outlaw the use of Windows 8 within government agencies.

The same state TV service branded Microsoft’s operating system as a threat to the nation’s cybersecurity, saying that it posed a “big challenge” and suggesting that the NSA may be using it to gather data.

Then there is the case of Russia which, shortly after Snowden’s defection, swiftly swapped computer hardware within the Kremlin for good old-fashioned typewriters in order to improve its security whilst creating a means for linking any created documents to a particular machine.

By way of contrast, the United Kingdom government is having a whale of a time with all this new technology, seizing upon the perceived threat of terrorism, paedophiles, etc., to rush in a law – which I think is draconian in nature – which will allow it to hold onto metadata for an entire year (if you want to know why that should concern you, whether or not you think you have ‘something to hide’, and including why it may pose a threat to democracy, then I highly recommend this recent post from Sarah Clarke in which she looks into the proposals in detail).

The fact that the UK government is doing a rush job on getting the proposals through Parliament leaves little to no opportunity for MPs to debate the Bill, and just as little time for us mere mortals to do the same. What is noticeable, though, is that we, as a nation, are not standing up when practices that threaten our security and privacy are brought to our attention in the way that the likes of China and Russia are.

Maybe we don’t need to because, after all, we live in a democracy and our elected officials are there at our whim to do as we ask.

But then again I don’t feel that way myself – I think that we are allowing technology to control our lives to some degree rather than make them simpler and we are too blind to see what is happening.

I believe that new technology is a good thing but the way in which much of it is utilised these days warrants a level of scrutiny and subsequent control that just isn’t there right now. Alternatively, the insecurity could be all mine.