Monthly Archives: August 2016

EU: Apple ‘avoided tax on profits made across the EU’ – but right now it owes €13bn in Ireland

Written as editor of the New Statesman’s NS Tech and first published here.

Apple must pay €13 billion (plus interest) to the Irish exchequer in a landmark EU ruling that has judged its tax arrangements in Ireland to be “illegal” under state aid rules.

The competition commissioner Margrethe Vestager has finally ruled after a two-year investigation that the company was given “illegal” benefits that gave it an unfair advantage.

“The Commission’s investigation concluded that Ireland granted illegal tax benefits to Apple, which enabled it to pay substantially less tax than other businesses over many years.

“In fact, this selective treatment allowed Apple to pay an effective corporate tax rate of 1 per cent on its European profits in 2003 down to 0.005 per cent in 2014.”
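To give a sense of scale, here is a back-of-the-envelope illustration of what each rate means per million euros of profit (illustrative only; nothing below comes from the ruling beyond the two quoted rates, and the function is mine):

```python
def tax_due(profit_eur, effective_rate_pct):
    """Tax owed on a profit at a given effective rate (in per cent)."""
    return profit_eur * effective_rate_pct / 100

# For every €1 million of European profit:
print(tax_due(1_000_000, 1.0))    # 10000.0 – €10,000 at the 2003 rate
print(tax_due(1_000_000, 0.005))  # 50.0 – just €50 at the 2014 rate
```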

The ruling relates to two tax deals made by Apple in Ireland in 1991 and 2007.

The commission disputed the ability of what it called Apple’s ‘head office’ in Ireland to generate $22 billion in profit that went largely untaxed. “The ‘head office’ did not have any employees or own premises,” the commission said.

Going further than this, though, the commission said Apple has been able to “avoid taxation on almost all profits generated by sales of Apple products in the entire EU Single Market”.

It’s illegal under EU state aid rules for tax arrangements to give companies an unfair advantage within one country; the cross-border tax issue, however, is as yet outside the remit of EU state aid control.

Our friends over at Currency Fair were quick to LAH.

Although the EU said this ruling was not designed to undermine the wider Irish tax system, Ireland’s Minister for Finance Michael Noonan is likely to appeal the decision.

“This is necessary to defend the integrity of our tax system; to provide tax certainty to business; and to challenge the encroachment of EU state aid rules into the sovereign member state competence of taxation.”

Indeed, Ireland has already spent more than €670,000 trying to defend itself against the ruling.

US regulators last week outlined their shared “concern with tax avoidance by multinational firms”  but said:

“These investigations, if continued, have considerable implications for the United States — for the U.S. government directly and for U.S. companies—in the form of potential lost tax revenue and increased barriers to cross-border investment. Critically, these investigations also undermine the multilateral progress made towards reducing tax avoidance.”

Jeremy Corbyn heads to trendy East London to launch plans for public eBay and Google

Written as editor of the New Statesman’s NS Tech and first published here.

Jeremy Corbyn has headed to civic tech space Newspeak House in trendy East London to launch a digital democracy manifesto that he intends to take into the next general election.

While giving a nod to Skype and Google for transforming our everyday lives, he said the internet’s advances could become “forces of inequality and exploitation”.

This plan, he said, would “democratise the internet”, while also acknowledging the many people who face social exclusion as government and politics move online.

He said Labour would ensure that “no community is left behind”.

None of the pledges are brand new ideas, and several are already being pursued to some extent by the current government, but here they are:

  • £25 billion investment in a Universal Service Network to give mobile and broadband internet access to the whole of the UK

“Inequality of coverage is not trivial,” Corbyn said. He called it a “barrier to social and educational opportunity”. The policy is not unlike the Conservative Party’s Universal Service Obligation, which has proved long and hard to deliver.

  • Open Knowledge Library as a hub for lessons and curriculum, a public Google platform?
  • Platform Cooperatives that operate as a public trading platform for digital goods and services, like eBay and Taskrabbit in one?

This would be enhanced by “reformed copyright laws” to help protect the UK’s cultural workers and a “new kind of trade union membership” for digital workers.

  • Digital Citizen Passport for people to interact with government and private companies, not unlike the (abandoned?) midata project or the current GOV.UK Verify
  • Library for open source software that has been publicly funded
  • People’s Charter of Digital Liberty Rights, announced earlier this year, to “protect people from unwarranted surveillance” and enshrine privacy and freedom of speech
  • Massive Multi-Person On-line Deliberation to let people participate in policy decisions

Corbyn also made mention of online voting in elections but said this would have to be “open to widest possible consultation”.

Should tech companies join the government’s counter-terrorism unit?

Written as editor of the New Statesman’s NS Tech and first published here.

MPs reporting on the radicalisation of young people in the UK couldn’t be more clear on where they stand on the impact of digital on this issue:

“The use of the internet to promote radicalisation and terrorism is one of the greatest threats that countries including the UK face.”

The ‘Radicalisation: the counter-narrative and identifying the tipping point’ report made big headlines because Twitter and Facebook received lengthy criticism for not being proactive enough on policing ‘extremist’ content.

Google’s YouTube received the highest praise for implementing a rapid-response flagging system so a trusted group can highlight potentially harmful material and alert the company as quickly as possible.

Twitter was singled out because it does not proactively notify police of material that poses a threat to life; the company said this is because the content is public for anyone to find. But it did confirm it had more than 100 people working on this issue and, between mid-2015 and February 2016, had suspended 125,000 accounts. Google said it had removed 14 million videos globally in 2014.

All three companies, along with Microsoft, have also recently signed up to new EU rules on tackling illegal hate speech, the MPs conceded.

It’s clear now that these aren’t just IT or tech or social media companies; they are big parts of people’s lives and huge distributors of content, and therefore can’t hide behind their relative newness.

A whole five pages of the report were dedicated to the role of tech platforms, while just five paragraphs were dedicated to the old media: “In short, what cannot appear legally in the print or broadcast media, namely inciting hatred and terrorism, should not be allowed to appear on social media,” the MPs said.

It’s not surprising that websites distributing information with billions of users, versus traditional media outlets with relatively insignificant audiences, have found themselves here, as MPs explained:

“The internet has a huge impact in contributing to individuals turning to extremism, hatred and murder. Social media companies are consciously failing to combat the use of their sites to promote terrorism and killings. Networks like Facebook, Twitter and YouTube are the vehicle of choice in spreading propaganda and they have become the recruiting platforms for terrorism. They must accept that the hundreds of millions in revenues generated from billions of people using their products needs to be accompanied by a greater sense of responsibility and ownership for the impact that extremist material on their sites is having…

“These companies are hiding behind their supranational legal status to pass the parcel of responsibility and refusing to act responsibly in case they damage their brands. If they continue to fail to tackle this issue and allow their platforms to become the ‘Wild West’ of the internet, then it will erode their reputation as responsible operators.”

MPs have called on these companies to produce quarterly public reports on their efforts in this area, detailing what they have removed and why.

MPs also floated the suggestion, made in the evidence of Baroness Shields, the government’s Minister for Internet Safety and Security, that tech companies should invest in technology to “automate the identification and removal of dangerous extremist content”. We know we can’t always trust computers to decide what the right thing to do is, but there you go.

But, in what is likely an unprecedented move, the MPs also suggest that tech companies set up a permanent home within the police’s Counter Terrorism Internet Referral Unit.

“It is odd that when taking down dangerous and illicit material the CTIRU needs to waste time trying to establish contact with organisations outside the unit. Representatives of all the relevant agencies, including the Home Office, MI5 and major technology companies, should be co-located within CTIRU. This will enable greater cooperation, better information-sharing and more effective monitoring of and action against online extremist propaganda.”

It’s one thing to recognise the status and power of these newer companies in the media market, but this recognition surely comes with some ethical need for independence?

The new Independent Press Standards Organisation (IPSO) was unable to comment on this issue, although they have recently launched a digital review of regulations and the way they apply to global digital publishers.

NS Tech has reached out to the Editors’ Code of Practice Committee, as well as Facebook, Twitter and Google, to find out whether they think social media companies now need to be more regulated – and therefore also independent – like the press.

We will update when we hear back.


Facebook and Google declined to comment on joining the counter-terrorism unit.

iPhone hack proves mobile is the new battleground – and it’s us humans that are truly vulnerable

Written as editor of the New Statesman’s NS Tech and first published here.

Apple has issued a major patch to iOS 9 after a human rights activist reported a strange text message that was found to contain three zero-day vulnerabilities.

Smartphones have long been thought more secure than desktop computers, but this attack would have enabled hackers to see inside the user’s device, including tracking his movements, recording phone calls and logging messages.

An experienced avoider of state surveillance, Ahmed Mansoor didn’t click the link, but sent it on to Citizen Lab at the University of Toronto, who worked with security firm Lookout to test the software.

“The implant installed by the [now nicknamed] Trident exploit chain would have turned Mansoor’s iPhone into a digital spy in his pocket,” the researchers said.

“We are not aware of any previous instance of an iPhone remote jailbreak used in the wild as part of a targeted attack campaign, making this a rare find.”

The problem was reported to Apple and a new version of iOS 9 was delivered within 10 days. Updating to the latest version means the attack will no longer work, but of course doesn’t protect people from new, future exploits.

Interestingly, the researchers not only explain that all of the three tools used in the attack were from “lawful intercept” spyware companies, but that the trail is thought to lead back to a US venture capital-owned business, NSO Group.

Citizen Lab believes that Israeli firm NSO Group builds software that is specifically designed and sold to government agencies. Attacks of this level of sophistication could be worth millions to those wishing to target journalists, human rights campaigners and other interesting parties.

“That the companies whose spyware was used to target Mansoor are all owned and operated from democracies speaks volumes about the lack of accountability and effective regulation in the cross-border commercial spyware trade,” the team added.

Speaking last week about a similar set of software hacks believed to belong to the NSA, former US Defense Intelligence Agency officer Michael Tanji issued a harsh reality check:

“If there is a potentially dangerous side-effect to the discovery of a set of 0-days allegedly belonging to the NSA it is the dissemination of the idea, and credulous belief of same, that intelligence agencies should place the security of the Internet – and commercial concerns that use it – above their actual missions…

“The idea that someone, somewhere, working for someone else’s intelligence agency might not also be doing vulnerability research, uncovering exploitable conditions in popular networking products, and using same in the furtherance of their national security goals is a special kind of hubris.”

Yes, the battle is truly on, between security researchers and commercial companies working on behalf of their users (and their brand reputations), and state actors and non-state actors who give no s**** about your digital identity.

That individual citizens must rely on the kindness of strangers to notice, test and then responsibly disclose new threats to the relevant company, before it’s too late, leaves us truly vulnerable in the vulnerability exploitation business.

Why Hull has cream phone boxes (and why it’s relevant to tech today)

Written as editor of the New Statesman’s NS Tech and first published here.

Hull’s set to become the UK City of Culture in 2017; cue visions of John Prescott cutting the ribbon, looking as cultured as some bloke who’s just quickly pulled on his Sunday best.

The big moment is fast approaching, but it’s not only the city’s political leaders that might be in need of a face lift.

Despite being a significant trading hub as far back as medieval times, Hull’s telecoms infrastructure hasn’t kept pace with technological change.

Although Hull is reportedly the only city in the UK that is getting broadband officially described as “ultrafast” as standard, the leading local network provider KCOM hasn’t yet delivered.

Hull is the only city in the UK to have kept (until 2007) an independent, municipal telephone network provider: KCOM.

Image credit: RM21/Wikimedia Commons

And that’s why it has distinctive cream phone boxes and its residents received the White Pages telephone directory, rather than Yellow Pages.

But Hull was also one of only two places named in Ofcom’s Connected Nations report in 2015 where more than 30 per cent of businesses were stuck with sub-10 Mbps broadband.

In another report that flags poor connectivity as a significant issue for the city, the University of Hull concluded:

“Currently, the region finds itself towards the bottom of the league for most key metrics related to economics, skills, employment, social mobility, entrepreneurship and innovation.”

KCOM has committed to ramping up its roll out of Lightstream, which the company says is up to 25 times faster than copper cable broadband. It’ll be available to around three quarters of properties within its network over the next 18 months.

In the meantime, though, Hull is set to gain ‘Gigabit City’ status, thanks to a new, large-scale fibre roll out by CityFibre, which is partnering with young local KCOM competitor Pure Broadband.

Barack Obama likened the availability of super-fast, fibre-optic internet to that of “being the first city to have fire”.

He said that these internet speeds are akin to “unleashing a tornado of innovation” and many cities across the world are working out how they can get a slice of the action.

CityFibre claims its network offers speeds up to 1,000 Mbps (1 Gbps) and says its network is future-proofed to be able to allow for ever-greater capacity. It’ll soon be laying fibre across 62 km of the city to try to compete directly with KCOM’s effort.
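For a rough sense of what that difference means in practice, here is a back-of-the-envelope comparison of download times on a sub-10 Mbps line versus a 1 Gbps one (a sketch that ignores protocol overheads and contention; the function is mine):

```python
def download_seconds(file_gigabytes, speed_mbps):
    """Approximate transfer time for a file, ignoring overheads.
    1 gigabyte = 8,000 megabits."""
    megabits = file_gigabytes * 8 * 1000
    return megabits / speed_mbps

# A 5 GB HD film:
print(download_seconds(5, 10) / 60)  # ~66.7 minutes on a 10 Mbps line
print(download_seconds(5, 1000))     # 40.0 seconds on a 1 Gbps line
```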

The company has already upgraded most of the city to 4G, having installed fibre connections to mobile masts throughout Hull, in partnership with EE and Three.

Hull is now home to incubator and business innovation space C4DI and is a key city that could benefit from the government’s Northern Powerhouse initiative.

Let’s hope it doesn’t continue to be held back by the slow web speeds identified by Ofcom as the spotlight lands there during its City of Culture year.

The University of Hull’s State of the Humber Economy report suggests that the city “plan to actively support entrepreneurship and innovation”.

If Barack Obama is to be believed, becoming a Gigabit City is most of the job done.

Linux is 25 years old today – so is it still the future of computing?

Written as editor of the New Statesman’s NS Tech and first published here.

Linux is probably the only operating system that all of us use every day, but only some of us actually know it.

Its creator, Linus Torvalds, first posted about his work on this new, free OS back in 1991 but said it was “just a hobby, won’t be big”. How wrong, or perhaps humble, he was.

Everyone from Google to IBM, NASA to the New York Stock Exchange, uses the open source software in one shape or another. But, legend has it, self-effacing developer that he is, Torvalds initially shied away from using part of his name in naming this new product.

“I’m so glad it didn’t end up being called Freax,” says Martin Percival, senior solutions architect at Red Hat, probably open source’s biggest commercial success story, built right on top of Linux. “Had Linus got his way, Linux may not have become such a success with the companies that it has.”

One of those companies is, of course, Red Hat, which is credited with making Linux enterprise-ready with its Red Hat Enterprise Linux operating system. “If it weren’t for the rise of Linux, Facebook and Google and so many others would have had such a harder time getting scale,” Percival says.

For Red Hat, its distribution of Linux helped the company reach annual revenues of $1.79 billion last year. It’s a profitable, open source company that works on a subscription model – giving you the software for free in exchange for ongoing technical support.

As well as testing new bits of Linux to ensure they’re compatible with your existing infrastructure, that also means ensuring its users are protected from patent trolls who try to find companies accidentally using proprietary code.

Open source, open society, open job roles

As open source has grown, it seems to have happened in sync with society becoming more open. Indeed, these trends may well have fed off each other.

Red Hat’s current CEO Jim Whitehurst, author of The Open Organization, talks in public as much about openness in companies and society as he does about open source.

“Society has changed a lot over 20 to 30 years,” Percival agrees. “There’s a generation of folks coming through who connect more, are more open with peers, share more and that naturally flows through into the workplace.

“They’re confused by organisations that say ‘you’re not allowed to pull apart this thing we’ve provided and if you do, to fix a problem, we’ll sue you’. That’s an insane thing to say to an organisation that’s fixed a problem.”

And where big, historically proprietary companies are letting professional staff go left, right and centre, open source jobs are growing. Not least because so many enterprises now use open source in much of their operations.

“Open source has reached a point where it’s relied upon to do substantial amounts of infrastructure we rely on,” Percival says.

“It’s now good enough, robust enough, reliable enough, so why would most organisations go off and build giant teams to create solutions for themselves when they can build open source or on the public cloud?

“It’s not great for the poor folks who are being put out of business by robust open source software. But that just means they have to redeploy their teams in a more agile way to build software with a business value that sits on top of it.”

That feud

By the early 2000s, Dell, IBM, HP and Oracle had all already lent their support to Red Hat in one way or another. Indeed, Intel is the largest contributor of code to the underlying Linux kernel. That’s not least because the creation of this new OS profoundly shaped how Intel designed its chip sets. 

But Red Hat, and no doubt the entire open source community, did have a long-running feud with Microsoft around the principles that technology is built upon.

So much so that Red Hat’s first CEO Bob Young wrote in 2000:

“The software industry that Microsoft has been the role model for is built on the premise that customers are not to be trusted with the technology that they are building their organizations on.

“The legacy software industry is built on the proprietary binary-only model where not only does the user not get the source code he needs to make changes, but worse he receives the product under a license that essentially says that if you make any improvements to the technology you are using, if you solve a bug that is causing your systems to crash, or add a feature that your users or customers desperately need the vendor can have you thrown in jail. (If you don’t believe me, just read any shrinkwrapped software license).

“This kind of business model, where the customer is completely beholden to his supplier exists in no other industry in any free market that I know of. It harks back to the old feudal systems of 12th century Europe.”

Then, just two months ago, Red Hat made a historic deal with Microsoft. “That was unheard of and pretty unimaginable five years ago,” Percival admits.

“Microsoft has come under pressure to respond to the challenges that open source has thrown up and now it’s going faster in this direction than it has for a long time. And everybody benefits.

“Innovations in the marketplace, companies like Hadoop and Docker, all of that has used open source. The challenge to companies selling proprietary software is: why have you not spotted this trend? Of course, it’s the innovator’s dilemma and there’s a natural tendency to cling on just too long.”

As a community, Linux dealt with the issues thrown up by the rise and rise of virtualisation; it also took a bet on containers two years ago and now they’re everywhere.

Perhaps its largest challenge today is that it’s so widely used, hackers are on the hunt for vulnerabilities. A flaw found earlier this month puts the world’s billion-or-so Android users at risk. And that really also means every organisation that’s built on Linux too.

Seeing Fuchsia

The latest from Google is that, unlike the Android operating system, its upcoming, custom-built OS Fuchsia will no longer be built on the Linux kernel. The Register’s analysis of what we know so far speculates:

“If it can create a fully optimized platform for each key emerging area of connected experience, and then marry them all together at the applications layer with the ubiquitous Android, it might achieve what Unix, Linux and Java promised, but failed to deliver all these years.”

Yes, Linux has done much for many, but now Google appears to be hedging its bets on a new approach to open source development. It remains to be seen whether developers would be willing to trust a company over the community-driven Linux effort.

So what’s on the horizon for the world’s most popular OS? Speaking to the criticism that it hasn’t quite built an ‘any device, anywhere OS’, Percival says:

“There will be tweaks to Linux to better handle stuff in the IoT space, but that’ll largely be handled in a layer above the OS that deals with more specialised problems. On AI, we’re working out how you build an AI layer that meets the needs of business: how do you make it fast enough? Do you really need it in the Linux layer?”

On launching its 25th birthday report, which found that 13,500 people from 1,300 companies have so far been identified as contributors to this highly ambitious project – one that had surprisingly modest beginnings – Jim Zemlin, executive director of The Linux Foundation, said:

“Even after 25 years, Linux still serves as an example of how collaborative development can work, which can be applied to other open source projects.”

Whether or not the Linux kernel remains as popular as it is today in another 25 years, it’s surely the possibility of transparency, participation and community in tech and beyond that should be its lasting legacy.

Leading gay, lesbian and trans techies went to the White House yesterday – a great tech news story

Written as editor of the New Statesman’s NS Tech and first published here.

Amazon, New Relic and Netflix staff were among an audience of almost 200 leading technologists invited to the White House to talk about how the LGBTIQ community can help “tackle some of the world’s biggest challenges”.

The White House LGBT Tech & Innovation Briefing is sponsored by the (lesbian) US government CTO Megan Smith, who used to be VP of Google’s moonshot division [x], along with Lesbians Who Tech CEO Leanne Pittsford.

Speaking at the event, Smith said:

“Equality, diversity, justice, inclusion and innovation, all of these topics, which have always been part of our country and are very much live right now… are things that require all of us to be involved.”

This year’s event was designed to gather an audience that was 50 per cent people of colour, 50 per cent women and 20 per cent non-gender-binary people.

Smith urged techies to join the government, citing APIs and open source as just two technologies the administration wholly supports. “Please come do that. Or add these things to your products,” she said.

She also flagged The Opportunity Project, which brings together US government datasets to help citizens and others build data-driven community tools.

The group was there not just to discuss how tech and innovation can be used to tackle issues experienced by minority groups, though “LGBT and racial diversity inclusion in the tech industry” was one of the key topics.

.@WhiteHouse working together on tough problems that face #lgbtq in tech and inclusive policy

— Ana Arriola (@arriola) August 24, 2016

Preventing gun violence, big data and privacy, democratic representation issues, prison reform and environmental concerns were all worked on during the day too.

The event is building up to an inclusive innovation conference being created by Lesbians Who Tech this November.

Forget buying an artificial intelligence startup, here’s the real issue you have with AI

Written as editor of the New Statesman’s NS Tech and first published here.

Google, just like every other tech company out there, is focused on becoming an artificial intelligence company.

Its CEO Sundar Pichai has said as much, and startup buying trends since 2011 suggest that if you’re not acquiring new companies and then innovating on AI, you’re doing it wrong.

So what’s the biggest problem facing AI innovators everywhere?

According to Jeff Dean, head of the machine-learning team Google Brain, one of the company’s key research areas within the 1,000-strong Research at Google team, it’s diversity.

Speaking as part of a Reddit Ask Me Anything, when asked if he thought an AI apocalypse was linked to a lack of humanistic thinking, and a lack of diversity, he said:

“I am personally not worried about an AI apocalypse, as I consider that a completely made-up fear. There are legitimate concerns around AI safety and policy, and our group (in collaboration with a number of other organizations) has recently published an Arxiv paper about some of these (see Concrete Problems in AI Safety). I am concerned about the lack of diversity in the AI research community and in computer science more generally.”

Of a team of around 35, only three of the Google Brain researchers are women. There are, in fact, as many men called Jeff (OK, including Geoff) on the team as there are women.

There are perhaps 10 black or brown faces and, using my rudimentary human intelligence, it isn’t easy to tell if any are from the LGBTIQ or disabled communities.

Dean is clearly aware of the shortcoming this creates, particularly as his team’s whole focus is to ‘make machines intelligent. Improve people’s lives’, the latter part of which is focused on creating good human outcomes.

Although lacking in diversity, his team is interdisciplinary: physicists, mathematicians, biologists, neuroscientists, electrical engineers, computer scientists and even a philosophy grad.

Why is diversity important to him?

“In my experience, whenever you bring people together with different kinds of expertise, different perspectives, etc., you end up achieving things that none of you could do individually, because no one person has the entire skills and perspective necessary.”

He wants diversity not only because his team will perform better, particularly with more women in it, but also because he believes women are more empathetic, which lends itself well to the social mission of Google Brain, and because diverse voices add valuable experience.

The AI, it seems, is the simple part.

Via Re/code

Apprenticeship levy still getting lukewarm reception from business

Written as editor of the New Statesman’s NS Tech and first published here.

The government has just unveiled more detail on its intention to apply a levy on employers with a payroll of £3 million or more in order to get more businesses training apprentices.

This is in a (reasonably arbitrary) bid to get three million people in England skilled up by 2020, but has been designed to tackle the fact that investment in this area has been inconsistent.

The compulsory levy represents a new tax on businesses, one that former Chancellor George Osborne reckoned would bring in £3 billion per year. And depending on who you ask, this is a bad move, a good step, or something that just doesn’t go far enough.

Money back

Of around five million businesses in the UK, around two per cent, or 100,000, will fall into the levy system.

The 0.5 per cent tax on the total amount paid to staff working in England is expected to arrive alongside a credit that offsets the first £15,000 levied and a 10 per cent top-up from government each month.

That would actually see those companies that only just pass the threshold paying nothing at all.

But for a big business such as Vodafone, with annual staff pay of almost £500 million, the bill is likely to run into millions of pounds each year. That doesn’t even cover the apprentice’s salary, and if the levy fund goes unspent, it expires after 18 months.
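The arithmetic behind those figures can be sketched as follows, using only the numbers above – the 0.5 per cent rate, the £15,000 offset and the 10 per cent top-up (the function names are mine, for illustration):

```python
def annual_levy(pay_bill):
    """Levy owed: 0.5 per cent of the annual pay bill, offset by the
    first £15,000 – so a company right at the £3m threshold pays nothing."""
    return max(0.005 * pay_bill - 15_000, 0)

def levy_fund(pay_bill):
    """Training funds available: the levy paid plus the 10 per cent
    government top-up."""
    return annual_levy(pay_bill) * 1.10

print(annual_levy(3_000_000))    # 0.0 – nothing at the threshold
print(annual_levy(500_000_000))  # 2485000.0 – millions for a Vodafone-sized pay bill
```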

In theory, the maximum cost of training is £27,000 per apprentice, depending on the role, with companies able to directly select what provider they pay for what training.

The Institute of Directors (IoD) has yet again called on the government to stop the plan, not least because of business uncertainty post-Brexit.

Last month, it was also joined by EEF, which represents the UK’s manufacturing employers, as well as the CBI and the Charity Finance Group (third-sector organisations will also be hit by the levy), in urging the government to rethink.

“Our members are fully in favour of the levy in principle,” Seamus Nevin, head of skills and employment at the IoD, told NS Tech. “They know they have to step up to the plate and train staff.

“But we’re asking for it to be postponed in order to allow time for the government to engage more with employers.

“We’re worried that the three million figure will just become a box-ticking exercise that sees three million apprenticeship starters, rather than people finishing apprenticeships.”

He questions whether quantity will be prioritised over quality, and points to anomalies that mean charities and academy chains will be swept up into the system.

Boost for small business

Small businesses that aren’t hit by the levy, or those with insufficient funds built up to meet the cost of training an apprentice, will be asked to pay just 10 per cent of the cost, with the government making up the difference.

This means for small businesses, the training for many digital apprenticeships would be 90 per cent funded by government up to the value of £15,000, £18,000 or £27,000, depending on the role.

All of the new apprenticeship standards have been created in consultation with business, so getting yourself a swanky new infrastructure technician, who’d be trained on a curriculum designed by the likes of Microsoft, IBM, Cisco and BT, would see the government cover 90 per cent of your costs up to £18,000.
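A minimal sketch of how that co-funding split works for a non-levy employer (the function name, and the assumption that only costs within the funding band cap attract the 90 per cent subsidy, are ours for illustration):

```python
def co_funding(training_cost, band_cap):
    """Split an apprenticeship training cost for a non-levy employer:
    the government covers 90% of the cost up to the funding band cap,
    and the employer pays the remainder."""
    funded_portion = min(training_cost, band_cap)
    government = funded_portion * 0.9
    employer = training_cost - government
    return government, employer

# An infrastructure technician in the £18,000 band:
gov, emp = co_funding(18_000, 18_000)
print(gov, emp)  # government pays £16,200, the employer just £1,800
```

On these terms, the employer's contribution for a fully banded digital apprenticeship is a tenth of the headline training cost.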

Cuts to digital?

One criticism of the previous system has been that apprentice training costs floated towards the top of the brackets assigned by government for different jobs, so new funding bands have now been outlined that the government says will deliver “maximum value for the taxpayer”.

But that means most of the recognised ‘digital jobs’ have had the maximum estimated cost significantly decreased, meaning training providers might be required to do more with less.

“We are disappointed to see that the introduction of the new banding has meant funding for a diverse range of IT apprenticeship standards has reduced significantly, in some cases by as much as a third,” said Lucy Ireland, deputy CEO of BCS Learning and Development, part of BCS, The Chartered Institute for IT.

“On balance, this is a big step forward,” said Anthony Impey, CEO of Optimity, which helped design the new government apprenticeships.

“For a long time, the apprenticeship system has been stacked in favour of the training provider – and there are two losers in that – employers can’t get who they need and learners haven’t always had a great experience.

“Overall, we as employers have to take responsibility for gaps in job market – and because this kind of training has a direct benefit to us. I don’t take on apprentices as an altruistic effort, it’s good business.”

If a small business with fewer than 50 employees takes on an apprentice who’s aged 16 to 18, the government will waive payment altogether and give them an additional £1,000 to cover the cost of getting them up to speed.

This is designed to address concerns flagged by the Federation of Small Businesses that apprentices can represent a cost if it takes a lot of time to manage their new team member.

Not far enough?

Jonathan Clifton, associate director for public services at the IPPR, believes the latest proposals don’t go far enough.

“The government is absolutely right to introduce an apprenticeship levy. Following Brexit, British employers may not be able to rely on recruiting migrant workers to fill skills gaps – so we’ll need more apprenticeships to train up our domestic workforce.

“Today’s announcement is a step in the right direction – but it does not go far enough. The proposed apprenticeship levy will still only cover 2% of employers. In the long term, the government should expand the levy to cover all employers – because every firm has a role to play in training up the next generation.”

Interested parties have until 5 September to respond to the latest proposals, with a proposed October launch of the final plan.

Forget Hinkley Point, open data could save us from climate change

Written as editor of the New Statesman’s NS Tech and first published here.

You might not be familiar with heat networks yet, but they are a key part of the government’s strategy to cut the UK’s energy use, particularly in London.

The idea is that you do away with individual boilers and instead have a centralised system that supplies a number of homes through a network of pipes.

District heat networks already supply the majority of homes in Denmark with hot water and heating from a shared boiler system. And, though few people realise it, localised heat networks already heat around 2 per cent of UK homes today.

The greener the fuel that’s used to power the centralised boiler, the greener the system. That’s anything from gas, to biomass or solar, even waste heat from industrial processes, like the London Underground.

By 2025, one quarter of all London properties are expected to be using this kind of system, largely because developers have to prove they’re providing low-carbon heating.

So what has all this got to do with tech?

Cleantech startup Guru Systems received funding from the Department of Energy and Climate Change to explore the use of open data to make heat networks better.

Using the metering and monitoring technology in their networks, the company identified inefficiencies that, if addressed, could save the energy market 800,000 tonnes of CO2 – and £400 million – over 10 years.

“The majority of the £400 million in projected savings comes from a reduction in the over-sizing of networks as well as increased fuel efficiency across the lifetime of these new systems,” Casey Cole, MD of Guru Systems, said.

“Designers currently use an outdated model to calculate the maximum amount of heat needed at any one time, and this has led to networks being drastically oversized to meet demand they will never actually experience.”

Guru has now created a web-based platform called Pinpoint that displays network performance in real-time and works with a machine-learning algorithm to improve it. It can pinpoint a problem down to the specific house in the network and also suggest cost savings that could be made by tweaking the system.

The data from the initial project has also been opened up by Guru for housing developers to use, which it believes could save 30 per cent on the build cost of the network. Residents involved in the pilot saw their heating costs halved from 7.7p to 3.8p per kWh.

“It’s great to see this evidence of how the clever use of data and opening data to others can save money, enable new approaches and help us all to live lives that are more sustainable and efficient,” Jeni Tennison, technical director at the Open Data Institute, which also supported the project, said.

“Guru Systems is not only using data to bring benefits to its immediate customers, by opening up data they are providing information to the market as a whole that could have significant economic and environmental impacts on a macro scale.”

The government is currently conducting a consultation on heat networks and is set to invest £320 million into the development of these projects.

A full case study of the project can be found here.