
Tech Guides


What are the skills you need to earn more?

Packt Publishing
27 Jul 2015
2 min read
In the 2015 Skill Up Survey, Packt talked to more than 20,000 people who work in IT globally to identify which skills are valued in technical roles and which trends are changing and emerging. Responses from the Data Science and Business Intelligence community provided us with some great insight into how skills are rated across multiple industries, job roles and experience levels.

The world of Data Science is highly varied, and can be super competitive too, so we wanted to find out which industries are best for those just entering the market. We also discovered which technologies are proving most popular and where you can earn the best salaries. We also had some specific questions for which we wanted answers:

- Are Python and R still the key data science languages?
- What are the hottest trends?
- Which is the most lucrative sector for those with Data Science skills?
- Which cutting-edge technologies should you invest in learning?

We received a whopping 3,800 responses from the Data Science community. The largest number of respondents were Consultants who, with over 20 years' experience on average, provided some great insight into the Data Science world. Our survey identified which industry sectors pay the most for those with less experience and also showed up some big differences in average salaries across industries.

We also wanted to look at the technologies our community is using. More than 25% of our respondents use Python, but is it here to stay, or will it be overtaken by Julia in the next 12 months? The results were interesting and suggested that salaries influence responses! Unsurprisingly, Big Data emerged as a hot topic. When asked whether their company was planning to implement a Big Data project over the next 12 months, the response appeared to be tied to experience levels too. And of course, we just had to ask: does Excel still hold a place in your heart? Can you guess what the answer was?

Read the rest of the report to see what skills you need to build on and which technologies are poised to take the Data Science and Business Intelligence world by storm so you can get ahead of the competition. Click here to download the full report. Find out more about our exclusive Skill Up offers and discounts.


How Game Development Giants are Leading the VR Revolution

Casey Borders
28 Sep 2016
5 min read
We are still very much in the early days of virtual reality, with only two real choices for VR hardware: the HTC Vive and the Oculus Rift, both of which have been available for less than a year. As you would expect, there is a wide range of VR games and experiences being built by indie developers, since they tend to be the earliest adopters, but there has also been a lot of investment and support from much larger names, most notably Valve.

Valve has been in the games industry for almost 20 years and is the powerhouse behind the HTC Vive. Starting back in 2012, they brainstormed and prototyped early versions of their VR headset. Teaming up with HTC in the spring of 2014 took the Vive hardware to the next level and turned it into what we see today. In addition to playing a huge role in the development of the Vive hardware, they also built SteamVR, the software platform that powers the Vive. In fact, SteamVR supports both the HTC Vive and the Oculus Rift, which allows developers to target either platform using the same SDK. Valve has always said that they won't lock down SteamVR because they believe that being restrictive at this early stage would be bad for the industry.

Valve has also produced a sizable amount of VR content. Just after Oculus released their first development kit, Valve put out versions of Team Fortress 2 and Half-Life 2 that supported the Rift, and they were amazing. For a lot of people, these games were their first experience of VR, and they set the bar for how immersive and compelling it can be. Valve has also built the definitive experience for the HTC Vive with a game called The Lab. It's actually a collection of small games and demos allowing players to have a wide range of highly polished, AAA-level experiences. Based on community feedback, they've released the source code for some of the more popular features so that other developers can use them in their own games.

Other big game developers are starting to follow Valve's lead into VR. This year at E3, Bethesda announced that they are going to release a VR version of their incredibly popular game, Fallout 4. Fallout 4 VR will target the Vive and make use of the Vive's controllers for shooting and managing the Pip-Boy. EA has said that they are targeting the PlayStation VR with their Star Wars™ Battlefront: X-Wing VR Mission. Both of these games are sure to come with all of the polish that we expect from these studios and will be a huge step for bringing full-length games into VR.

Even large companies from outside the gaming industry are getting involved in VR. Oculus launched their Kickstarter project back in August of 2012 with a truly visionary take on what VR could become with modern hardware. Their Kickstarter went gangbusters: against a goal of $250,000, they ended up with $2.4 million from backers! They spent the next two years prototyping, iterating and refining their idea, and along the way released two development kits. They built a name for themselves and started to get the attention of some of the biggest names in tech. To pretty much everyone's surprise, in the spring of 2014, Facebook bought them for $2 billion.

The release of the first Oculus Rift development kit sparked the creativity of some amazingly clever Google engineers who decided to see what they could come up with using almost no money. They figured out that you could turn $10 worth of cardboard and plastic lenses into a VR viewer powered by a smartphone. Their incredibly cheap and simple solution did a surprising job of delivering on the VR experience. This year, at Google I/O 2016, Google announced that they were going to take a step up from Cardboard with a project called Daydream. Daydream is a VR platform built directly into Android starting with version 7.0 (Nougat). It's still powered by a mobile phone, but the viewer is going to be much sturdier and will come with a controller to allow users to interact with the virtual world.

VR is just beginning to pick up steam, and yet there are already a ton of great experiences from indie developers and mega studios alike. As lower-cost headsets become available and the price of existing hardware drops, I think we'll see more and more people bringing VR into their homes. This will create a much riper audience for larger AAA studios to take their first steps into the virtual world.

About the author: Casey Borders is an avid gamer and VR fan with over 10 years of experience in graphics development. He has worked on everything from military simulation to educational VR/AR experiences to game development. More recently, he has focused mainly on mobile development.


2014: The Year of the Hackathon to Remember

Julian Ursell
13 Feb 2015
4 min read
2014 was the year Kim Jong-Un watched you undress through your laptop web camera. Well, not quite, but at times it was almost as worrying. It did see some big plays from the black hats as they set out to pillage, obstruct and generally embarrass everything from major corporations to entire countries, as well as hacktivists intervening as crusaders of social justice. The hacks ranged from petty DDoS attacks to politically charged hacking threats, to all-out sex offences. There was also cross-fighting between hackers. The very real phenomenon has been the bane of security professionals' existence, permeating the international consciousness in perhaps the most prominent hacking 'wave' in recent memory. One can't deny the position of power the wily hacker possesses right now, and we saw this in many different ways throughout the last year. Was there really a grand context behind it all? Let's look back at the events of 2014, and what they meant, if anything at all.

It wasn't exactly a hack, but the Heartbleed vulnerability in the OpenSSL security software was one of the major spooks of the year, prompting a hysteria of password changes and security experts on the breakfast news. Several major websites and applications using an implementation of OpenSSL were affected to varying degrees, including Facebook, Instagram, Netflix and Gmail, although what it amounted to erred more often than not on cautionary advice rather than ultimatums on password changes, as many sites rapidly rolled out security patches. The majority seemed to have experienced no serious security breaches or malicious activity, seemingly catching the bug before hacker groups could really go to town. However, perhaps the strongest underlining to the whole debacle was the resonance it had in the open versus closed source security software debate. That there was a vulnerability lurking within the code of OpenSSL for two years was a hugely embarrassing oversight, bookended by the flood of attacks on servers made possible by the Shellshock bug at the end of the year. In the immediate future there will be a long, hard look at open source security, ensuring that the way in which the software is developed is in itself secure, and weighing up a greater potential interaction between open source and corporate funding.

August was a turbulent month for hacking, for hugely different reasons and on separate parts of the spectrum (it ain't just black and white, right?). The celebrity hacking scandal affectionately dubbed 'the Fappening' was responsible for the theft and leak of explicit media of several well-known celebrities, and was a big kick in the teeth for Apple's cloud storage service, iCloud. The internet was awash with panic as well as guidance about securing iCloud, putting the scare into people that malicious hackers could reach past the security mechanisms of technology corporations as sophisticated as Apple. It was also the month where hacktivism played a powerful role in real-world, unfolding events, as Anonymous intervened in the tense stand-off in Ferguson, USA, following the shooting of Michael Brown. As is Anonymous' typical modus operandi, they threatened the police with the release of sensitive information to the public (a method known as doxing), should they not reveal the officer responsible for the killing. However, in the pursuit of social, moral and political justice, Anonymous had to deal with a splinter in its own ranks, as a member was found to have misidentified the officer, forcing the group to swiftly denounce the loose cannon and its misinformation. Last year we saw hacking used yet again as an activist vessel wielded in defence of justice, demonstrating how cyberspace has become a significant dimension of real-world events.

Finally, on to Christmas. Lizard Squad had their fun making Xbox and PlayStation gamers cry (subsequently triggering a war with Anonymous), but the obvious big story was the furore over Sony's The Interview, as its depiction of the North Korean leader's demise wasn't taken with the light-hearted grace that I'm sure was previously shown for Kim Jong-Il's even-handed representation in Team America. Sufficiently terrified by a threat in broken English, and following the overture of one of the worst corporate network hacks in history, Sony backed down and pulled the film, then partially reneged by making it available through VOD, even prompting some to suggest the whole thing was a deliberate conspiracy (which was of course a whole load of hash). Anonymous, the Guardians of Peace, Lizard Squad; 2014 was the year the hackers really pushed all the buttons and got (for the most part) what they wanted. How the world deals with the black hats, the white hats, the hacktivists and the trouble makers in the future will be intriguing for sure.

2014: You Want A Container With That?

Julian Ursell
09 Jan 2015
4 min read
2014 Roundup: Networking, Virtualization and Cloud

2014 has been a year of immense movement in the networking, virtualization and cloud space. DevOps, infrastructure-as-code, containerisation, configuration management, lightweight Linux, hybrid cloud; all of these concepts carry a certain gravitas right now and will do even more so as we enter 2015. Let's take a look at the major developments over the past year, and what they could mean heading into the immediate future.

DevOps continues to transform software development and systems programming, with the rise of configuration management tools such as Ansible and SaltStack, and the expansion of the incumbent config management tools, Chef and Puppet (Puppet looks set to release version 4.0 some time early next year, and announced its new Puppet Server project at PuppetConf recently). Hashicorp, prolific in the DevOps space as the creator of Vagrant, has intriguingly unified all five of its open source projects under the umbrella of an all-in-one DevOps tool it has anointed Atlas. With the emergence of DevOps-oriented technologies geared towards transforming infrastructure into code and automating literally everything, developers and engineers have begun to approach projects from a broader perspective, speaking the new universal language of the DevOps engineer.

Up in the clouds, competition among vendors has centred on the drive to develop hybrid solutions that enable enterprises to take advantage of the heft behind open source platforms such as OpenStack, while combining public and private cloud environments. We've seen this with Red Hat's tuning of its own enterprise version of OpenStack for hybrid setups, and most recently with the announcement that Microsoft and Accenture are re-energising their alliance by combining their cloud technologies to provide a comprehensive super-stack platform comprising Windows Azure, Server, System Center and the Accenture Cloud Platform. Big plays indeed.

If there is a superstar this year, it has to be Docker. It has been the arranging metaphor for the development of a myriad of exciting new technologies and conceptual re-thinking, as well as a flurry of development announcements on both the open source and enterprise fronts. The eventual release of Docker 1.0 was greeted with fanfare and rapture at DockerCon this June, after about a year of development and buzz, and surrounding the release has been a parallel drive to iterate on the lessons learned to create complementary technologies and platforms. It has been largely due to Docker that we've started to see the emergence of new Linux operating system architectures which are lightweight and purpose-built for massively scaled distributed computing, such as CoreOS and Red Hat's prototype, Project Atomic. Both leverage Docker containers. The team behind CoreOS are even looking to rival Docker with their prototype container runtime, Rocket, which promises to deliver an alternative approach to containerisation that they argue returns to the simplicity of the original manifesto for Docker containers as a composable, modular building block with a carefully designed specification. Google open sourced its Docker container orchestration tool Kubernetes, and even Windows has jumped quickly onto the Docker train, with the development of a compatible Docker client for the next release of Windows Server.

A year ago, Mitchell Hashimoto coined the term 'FutureOps' for a vision of 'immutable infrastructure': servers pre-built from images which replace the need for continual configuration and replacement, enable lightning-fast server deployments, and provide automatic recovery mechanisms capable of detecting and anticipating server failures. Some consider this the height of idealism and argue that systems can never be immutable, but whether it is an achievable reality or not, the changes and developments of the last year seem to inch closer to making it happen. Docker is part of this big picture, whatever shape that may take; 2014 was the year it made big waves, the magnitude of which will undoubtedly continue into 2015 and beyond.
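For readers who have not yet tried containers, here is a minimal sketch of driving Docker from code rather than by hand, using the Docker SDK for Python. This is an illustration of the 'infrastructure as code' idea discussed above, not anything from the original piece; the image name and command are placeholders, and it assumes a local Docker daemon plus the docker Python package are available.

```python
# A minimal sketch: run a throwaway container from Python.
# Assumes a local Docker daemon and `pip install docker`.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Pull a small image and run a one-off command in an isolated container.
logs = client.containers.run("alpine:latest", "echo hello from a container", remove=True)
print(logs.decode().strip())

# Inspect what is currently running on this host.
for container in client.containers.list():
    print(container.short_id, container.image.tags, container.status)
```

The same few lines could just as easily live in a provisioning script or a CI job, which is the sense in which container tooling turns infrastructure into code.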


5 Rising Stars (you may not have been watching) for 2015

Sarah C
16 Feb 2015
3 min read
At the end of the year, we ran through the things that made the biggest difference for front-end developers in 2014. But what about the future? In 2015, what's the state of play for some of the most important new (and old) technology in web development, and what is there to look forward to? Here's the rundown of some projects in which we're invested for the coming twelve months.

Node/Io.js
At the end of last year, Node forked due to a difference of opinion on how to run things. Some thought this spelled disaster. In our opinion? Node's a great project that's only going to get better. The Io branch will add innovation, whether the projects continue to exist in parallel or merge again downstream. (The first official release in January already looks meaty.) Perhaps even more important than the tech itself, the future of Node/Io will add to the annals of open-source history. If the two projects can reach an amiable consensus, we'll have a great exemplar for ethical open-source and enterprise interdependence.

Neo4J
Neo4J is the datastore that stole the show for anyone trying to work with social data. Neo4J's graph database fundamentally changes the way we think about and use relationship modelling. This isn't a blip or a quaint hobby; the kind of information that graph databases can deliver is changing the way we use the web. Neo4J's devs are anticipating that a quarter of all enterprises will be using tech like theirs within three years. This year they're investing that $20m they just made in increasing adoption, and we're expecting to see plenty of developers investing their time in learning the ropes. (A short query sketch follows at the end of this piece.)

Angular 2.0
The news on Angular 2.0 is so big, we've got a whole post on it. Go see what Ed G has to say about one of the biggest things to hit JavaScript since jQuery.

PHP
About once every minute somebody stops at my desk, crosses the street, or books a round ticket from Australia to tell me PHP is dead. They're entirely wrong. In fact, in 2015 we should be getting our first look at PHP 7. (We're skipping 6; best not to ask.) 2015 is set to be a good year for PHP. Now that the specifications are in place, the devs are ready to roll out new features for a new generation of PHP: there'll be a whole new core for a faster, more modern take on the classic language.

GruntJS
A smaller player on this field, but one that's been punching way above its weight for some time – will 2015 give us GruntJS 1.0? Grunt is a JavaScript task runner. It doesn't even have its own Wikipedia page right now, and yet we're hearing nothing but enthusiasm for how much it helps with workflow. Grunt built a better mouse-trap, and the world's started beating a path to its door. Now Grunt's developers are working on making everything more modular, with easier integration and dependency management. We're as excited for their future as they are.
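As a concrete taste of the relationship queries the Neo4J section above refers to, here is a minimal, hypothetical sketch using the official Neo4j Python driver. The connection URI, credentials, and the FOLLOWS relationship are invented for illustration and are not part of the original article.

```python
# A minimal, hypothetical sketch of social-graph queries with the
# official Neo4j Python driver (`pip install neo4j`). The URI,
# credentials and data model below are illustrative assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Create two people and a relationship between them.
    session.run(
        "MERGE (a:Person {name: $a}) "
        "MERGE (b:Person {name: $b}) "
        "MERGE (a)-[:FOLLOWS]->(b)",
        a="Alice", b="Bob",
    )

    # A typical social-data question: who is reachable from Alice
    # within two FOLLOWS hops?
    result = session.run(
        "MATCH (a:Person {name: $a})-[:FOLLOWS*1..2]->(p:Person) "
        "RETURN DISTINCT p.name AS name",
        a="Alice",
    )
    print([record["name"] for record in result])

driver.close()
```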


What Tableau can do for your data

Joshua Milligan
10 Sep 2015
4 min read
Joshua Milligan is a Tableau Zen Master, trainer and senior data consultant at Teknion Data Solutions, a company dedicated to empowering people to leverage their data. He is also the author of the wildly successful Learning Tableau. Using Tableau, Joshua has gained great insight into all the different hats a data scientist needs to wear. Here he explains a bit about his craft, the range of skills you really need to master as a data scientist (becoming analyst, designer, architect, mentor and detective all at once), and how Tableau can help you be all of them.

Early in my career, when I was developing software, I remember one of my mentors telling me that everything was about the data. Lines of code were written to read data, move it, create it, and store it. I wrote a lot of lines of code and it really was all about the data. However, over the past decade, I've come to a new appreciation for the meaning of data and how being able to see and understand it opens up a whole new world of insight. Over the years I've done everything from data modeling, analysis, and ETL to data visualization. I've used a lot of different tools, systems, and platforms. But the tool that really ignited my passion for data is Tableau.

Tableau is a data visualization and analytics platform which allows me to build visualizations and interactive dashboards by dragging and dropping fields onto a canvas. This gives me hands-on interaction with the data. The tool becomes transparent, and my attention can be focused on asking and answering questions.

As a consultant, I'm asked to play a variety of roles and I use Tableau in almost all of them. Some of these roles include:

- Data Detective: Tableau allows me to quickly connect to one or more data sources and rapidly explore. I'm especially interested in the quality of the data and discovering what kinds of questions can even be asked of a particular data set.
- Data Analyst: I use Tableau's visualization and statistical capabilities to carefully examine data, looking for patterns, correlations, and outliers. I may start with an initial set of questions given to me by business users or executives. The analysis yields answers and additional questions for deeper understanding.
- Dashboard Designer: Tableau allows me to turn my analysis into fully interactive dashboards. These can be anything from simple KPI indicators allowing executives to make time-sensitive decisions, to analytical dashboards allowing managers and team members to drill into the details.
- Tableau Architect: Tableau truly gives everyone the ability to perform self-service business intelligence. I work with organizations to help craft strategies around Tableau so they can go from having individuals or departments overwhelmed with creating reports for the entire organization, to opening up the data, in a secure and structured way, so that everyone can generate their own insights.
- Trainer and Mentor: Tableau is so intuitive and transparent that business users can very quickly use it – often creating their first visualizations and dashboards in minutes. The greatest fulfillment in my career is sitting down with others, or teaching a class of people, to empower them to understand their data.

My experience with Tableau and my interaction with other users led me to write the book Learning Tableau. My goal was to give readers, whether they were beginners or had been using Tableau for a few years, a solid foundation for understanding how Tableau works with data and how to use the tool to better understand their data.

Google Distill: Everything data pros need to know

Xavier Bruhiere
24 May 2017
6 min read
Momentum works

In 1984, the idea of an AI winter (a period where research in artificial intelligence almost completely stops) appeared. The interesting thing is that it emerged both from technical disillusion and from bad press coverage. Scientists were woeful after the initial promises of artificial intelligence; funding stopped, and everything broke apart for a while. This example serves as a solemn statement: the way research is communicated matters a great deal for its development. Through the lens of Distill, a publishing platform for machine learning papers, we will explore why. Then, we shall see why you should care if you embrace data for a living.

How we share knowledge

Engineering is a subtle alchemy of knowledge and experience. The convenient state of the industry is that we are keen to share both online. It builds careers and relationships, helps in the hiring process, enhances companies' image, and so on. The process is not perfect, but once the sources are right, it starts to become addictive. Yet shocking exceptions occur in the middle of this never-ending stream of exciting articles that crash forever in Pocket. A majority of the content is beautifully published on carefully designed engineering blogs, or on Medium, and takes less than 10 minutes to read. Sometimes, though, the click-bait title gets you redirected to a black-and-white PDF with dense text and low-resolution images. We can feel the cognitive overhead of going through such papers. For one, they are usually more in-depth analyses. Putting in the effort to understand them, when the material is less practical and the content harder to parse, can deter readers. This is unfortunate, because practical projects are growing in complexity, and engineers are expected to deliver deep expertise on the edge of new technologies like machine learning. So maybe you care about consuming knowledge resulting from three years of intensive research and countless more years of collaborative work. Or maybe you want your paper to help your peers progress, build upon it, and recognize you as a valuable contributor to their field. If so, then the Distill publishing platform could really help you out.

What is Distill?

Before exploring the how, let's detail the what. As explained in the footer of the website: "Distill is dedicated to clear explanations of machine learning." Or, to paraphrase Michael Nielsen on Hacker News, "In a nutshell, Distill is an interactive, visual journal for machine learning research." Thanks to modern web tools, authors can craft and submit papers for review before being published. The project was founded by Chris Olah and Shan Carter, members of the Google Brain team. They put together a Steering Committee composed of venerable leaders from open source, DeepMind, and YC Research. Effectively, they gathered an exciting panel of experts to lead how machine learning knowledge should be spread in 2017. Mike Bostock, for example, is the creator of d3.js, a reference for data visualization, and Amanda Cox is a remarkable data journalist at the New York Times. Together they shape the concept of "explorable explanations," i.e. articles the reader can interact with. By directly experimenting with the theories in the article, the reader can better map their own mental models to the author's explanations. It is also a great opportunity to verbalize complex problems in a form that triggers new insights and seeds productive collaboration for future work. I hope it sounds exciting to you, but let's dive into the specifics of why you should care.

(Image: TensorFlow Playground)

Why does it matter?

Improving the clarity, range of audience, efficiency, and opportunity of research communication should have a beautiful side effect: accelerating research. Chris and Shan explain the pain points they intend to solve in their article Research Debt. As scientists' work piles up, we need, as a community, to transform this ever-growing debt into an opportunity that enables technologies to thrive, especially today, when most of the world wonders how the latest advances in AI will be used. Distill gives you a flexible medium on which to base research clearly and explicitly, and it is a great way to improve confidence in our work and push faster and more safely beyond the current bounds.

But as a data professional, you sometimes spend your days thinking not about saving the world from robots, but about yourself and your career. As a reader, I can't encourage you enough to follow the article feed or even the Twitter account. There are only a few publications for now, but they are exceptional (in my humble opinion). Your mileage may vary depending on your experience, or how you apply data for a living, but I bet you can boost your data science skills by browsing through the publications. As an author, it provides an opportunity for:

- Reaching the level of quality and empathy of the previous publications
- Interacting with top-level experts in the community during reviews
- Gaining advantageous visibility in machine learning and setting yourself up as an expert on highly technical matters
- Making money by competing for the Distill Prize

The articles are lengthy and demanding, but if you work in the field of machine learning, you've probably already accepted that the journey is worth it. The founders of the project have put a lot of effort into making the platform as impactful as possible for the whole community. As an author, as a reader, or even as an open source contributor, you can be part of it and advance both your accomplishments and the state of the art in machine learning.

Distill as a project is also exciting to follow for its intrinsic philosophy and technologies. As engineers, we are overwhelmed by quality blog posts as we read to keep our minds sharp. The concept of explorable explanations could very well be a significant advancement in how we share knowledge online. Distill proves the technology is there, so this kind of publication could boost how a whole industry digests the flow of technologies. Explanations of really advanced concepts could become easier to grasp for beginners, or more actionable for busy senior engineers. I hope you will reflect on that and decide how Distill can move you forward.

About the author

Xavier Bruhiere is a Senior Data Engineer at Kpler. He is a curious, sharp entrepreneur and engineer who has built many projects, broken most of them, and launched and scaled what was left, learning from them all.


Retro-Modding: Rebuilding the past with consumer micro-computers

Sarah C
08 Dec 2015
4 min read
Micro-controllers lend themselves to all kinds of projects. From home automation to robotics, the long-predicted Internet of Things is finally flashing and beeping its way into our hobby time. Among the most satisfying uses of these tiny boards is to rebuild the great computer milestones of the past. Projects that took teams of expert developers years to create not so very long ago can now be reproduced with a few weeks of tinkering. They're fun, they're nostalgic, they're a great way to learn, and they make fantastic gifts.

The number of single-chip boards and micro-computers available is growing all the time, and they are becoming ever more affordable. (We here at Packt are excited for the Banana Pi, Galileo, and HummingBoard.) For now, there are three main options if you want to get to work on fixing the past for yourself. Speed dating with the three big players:

- Arduino is a single circuit board with a microprocessor. It's tiny, it's light, it's great for custom firmware – use it to power wearables, LED displays, remote-control devices, or an army of tiny robot slaves.
- Raspberry Pi is a micro-computer – unlike Arduino, it runs an operating system. Since its memory lives on an SD card, you can build multiple projects for one board and just swap its brains out whenever you want. Raspberry Pi's support for audio and video makes it great for retro-gaming, media projects, and a wealth of other projects dreamed up by its active user base.
- BeagleBone is another micro-computer like Raspberry Pi. It's got a powerful processor and all the connectors in the universe. (Seriously – you can connect anything to it. Twice. The Borg probably run on BeagleBone.) It's a great fit for powering home automation, robotics projects, or your very own Rube Goldberg-esque media player.

Once you've chosen your hardware, there are thousands of options for repurposing all those bags of old electronics that you were keeping just in case 1996 rolled back around.

Rebuild an old toy
Remember when ComQuest was the cutting edge of the toy catalog? Or Speak and Say devices? They've long since been outstripped by better tech, but for those of us who grew up with them they still hold a certain power. You never forget that first sense of mystery – how does it work? How does it speak? Why can't I teach it to call my sister names? Now you can. With an old case and an ARM board you can hack your own childhood as much as you like.

Hand-held gaming
A common project with micro-computers is to refurbish or rebuild an old Game Boy. With a Raspberry Pi, an emulator, and an old case (or a 3D printer and a sense of adventure) you can make your own Game Boy. Higher resolution, better colours, clearer sound, and any peripherals you like are optional. (You could even go all out and build a Game Boy with a Sega emulator.)

Stylish USB keyboards
Nowadays the Commodore 64 is slightly less powerful than the remote control for your TV or the thermostat in your fridge. But with a little modification, it makes for a nostalgic and surprisingly comfortable USB keyboard.

Computer cases
Less about the coding, and more about crafting and perhaps a little soldering – but the robust plastics of old devices can make for great protective casing for the delicate and exposed circuitry of most ARM boards. One seller on Etsy is even 3D printing replicas of the Apple II to make Raspberry Pi cases.

Arcade games
Think big. A single Raspberry Pi can power your very own arcade machine with hundreds of games and no need for quarters. Unless you want to incorporate a coin slot – then you can certainly do that too. How you build the case is up to you – sprayed metal, polished mahogany, or cardboard and poster paints are all Raspberry Pi compatible.

Beautiful old radios
For years before the age of disposable plastics, radios and televisions were designed to be part of your house's furniture. Scouting around the right markets or websites, you can find some truly beautiful broken electronics. Repairing them with original parts is costly when it's even possible. But internet radios are some of the simplest problems around for ARM boards. There's no reason why you can't combine the aesthetic best of the mid-twentieth century with a state-of-the-art interior for less than the cost of a hipster knock-off transistor.
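To make the internet radio idea a little more concrete, here is a minimal sketch of the kind of script that could run on a Raspberry Pi inside one of those lovely old cases. It uses the RPi.GPIO library and the mpg123 command-line player; the GPIO pin, wiring and stream URLs are placeholder assumptions rather than anything prescribed by the article.

```python
# A minimal sketch of a one-button internet radio for a Raspberry Pi.
# Assumes `mpg123` is installed and a push button is wired between
# GPIO 17 and ground; the pin number and stream URLs are placeholders.
import subprocess
import time

import RPi.GPIO as GPIO

BUTTON_PIN = 17
STATIONS = [
    "http://example.com/stream-one.mp3",
    "http://example.com/stream-two.mp3",
]

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

player = None

def play(url):
    """Stop any running player process and start streaming the given URL."""
    global player
    if player is not None:
        player.terminate()
    player = subprocess.Popen(["mpg123", "-q", url])

try:
    current = 0
    play(STATIONS[current])
    while True:
        # Button pressed (pin pulled low): switch to the next station.
        if GPIO.input(BUTTON_PIN) == GPIO.LOW:
            current = (current + 1) % len(STATIONS)
            play(STATIONS[current])
            time.sleep(0.5)  # crude debounce
        time.sleep(0.05)
finally:
    if player is not None:
        player.terminate()
    GPIO.cleanup()
```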


Will data scientists become victims of automation?

Erik Kappelman
10 Sep 2017
5 min read
As someone who considers themselves something of a data scientist, this is an important question. Unfortunately, the answer is: it depends. It is true that some data scientists will be automated out of their usefulness. I'm not a fan of the term 'data scientist' for a whole bunch of reasons, not the least of which is its variable definition. For the purposes of this piece, we will use Wikipedia's definition: "[Data science] is an interdisciplinary field about scientific methods, processes, and systems to extract knowledge or insights from data in various forms, either structured or unstructured." In short, data scientists are people who practice data science (mind blown, I know).

Data science defined
Data science can be broadly defined to consist of three categories or processes: data acquisition or mining, data management, and data analysis. At first blush, data scientists don't seem to be all that automatable. For one thing, data scientists already use automation to great effect, but they are still involved in the process because of the creativity that is required for success.

Producing creativity
In my opinion, creativity is the greatest defense against automation. Although computer technology will get there eventually, producing true creativity is pretty far down the line toward complete artificial intelligence. By the time we get there, we should probably be worried about SkyNet and not job security. At present, automation is best applied to predictable, repeated tasks. If we look at the three elements of data science mentioned earlier and try to broadly apply these criteria for likelihood of automation, we might be able to partially answer the title question.

Data mining
Data mining is simultaneously ripe for the picking for automators and a likely no-automation stronghold, at least in part. Data mining consists of a variety of processes that are often quite tedious, and there is a lot of redundancy or repetition in how it is performed. Let's say there is a government agency collecting metadata on every phone call placed inside a country. Using any number of data mining techniques, a data scientist could use this metadata to pick out all kinds of interesting things, such as relationships between where calls are made and who the calls are made to. Most of this mining would be performed by algorithms that repeatedly comb new data and old data, connecting points and creating usable information from a seemingly infinite pile of individually useless phone call metadata (a short sketch of this kind of aggregation appears at the end of this article). Much of this process is already automated, but somebody is still there to create and implement the algorithms that are at the core of the automation. These algorithms might be specifically or generally focused, and they may need to be changed as the needs of the agency change. So, even if the process is highly automated, data scientists will still have to be involved in the short to medium term.

Data analysis
Data analysis sits in a similar place to data mining in terms of likelihood of automation. Data analysis requires a lot of creativity up front and at the end. Data analysts need to come up with a plan to analyze data, implement the plan, and then interpret the results in a meaningful way. Right now, the easiest part of this process to automate is the implementation. Eventually, artificial intelligence will advance enough that AIs can plan, implement, and interpret data analysis completely with no human involvement. I think this is still a long way off (decades even), and again, keep SkyNet in mind (one can never be too careful).

Data management
Data management seems like it should already be automated. The design of databases does take plenty of creativity, but it's the creative implementation of fairly rigid standards. This is a level of creativity that automation can currently handle. Data storage, queries, and other elements of data management are already well within the grasp of automation routines. So, if there is one area of data science that is going to go before the rest, it is definitely data management.

Victims of automation
So the answer is yes, data scientists will most likely become victims of automation, but when this happens depends on their specialty, or at least their current work responsibilities. This is really true of almost all jobs, so it's not necessarily a very illuminating answer. I would say, however, that data science is a pretty safe bet if you're worried about automation. Many other professions will be completely gone (I'm looking at you, automated car developers) by the time data scientists even begin to come under fire. Data scientists will become unemployed around the same time lower-skilled computer programmers and system administrators are heading to the unemployment line. A few data scientists will continue to be employed until the bitter end. I believe this is true of any profession that involves the creative use of technology. Data scientists should not post their resumes yet. I believe the data science industry will grow for at least the next two decades before automation begins to take its toll. Many data scientists, due to their contributions to automation, will actually be the cause of other people, and perhaps themselves, losing their jobs. But, to end on a happy note, suffice it to say that data science is certainly safe for the near future.

About the Author
Erik Kappelman wears many hats including blogger, developer, data consultant, economist, and transportation planner. He lives in Helena, Montana and works for the Department of Transportation as a transportation demand modeler.
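To ground the data-mining example from the article, here is a minimal, hypothetical sketch of the sort of aggregation a data scientist might automate over call metadata, written with pandas. The column names and the tiny in-memory dataset are invented purely for illustration.

```python
# A minimal, hypothetical sketch of mining phone-call metadata with pandas.
# The columns and records below are invented for illustration only.
import pandas as pd

calls = pd.DataFrame(
    {
        "caller":  ["A", "A", "B", "C", "A", "B"],
        "callee":  ["B", "C", "C", "A", "B", "A"],
        "tower":   ["north", "north", "south", "south", "north", "south"],
        "seconds": [120, 35, 600, 45, 300, 20],
    }
)

# Which pairs of numbers talk the most, and for how long in total?
pair_stats = (
    calls.groupby(["caller", "callee"])["seconds"]
    .agg(["count", "sum"])
    .sort_values("sum", ascending=False)
)
print(pair_stats)

# Where are calls being placed from? Aggregate call time by cell tower.
print(calls.groupby("tower")["seconds"].sum())
```

Scripts like this are exactly the 'repeatedly comb the data' step that is easy to automate; deciding which questions to ask of the data is the part that still needs a person.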

5 things that matter in tech in 2018

Richard Gall
11 Dec 2017
3 min read
It's easy to get drawn into clichés around technology, such as the idea that things move quickly. In actual fact, in the tech world what passes for 'progress' takes place at different rates in different places. Yes, Silicon Valley companies may be trying to push boundaries, and teenagers are able to hack into even the most secure systems on the planet, but there are still plenty of companies where data science is a folder of Excel spreadsheets and a cloud infrastructure is a Google Drive system. That makes it hard to say exactly what will matter in tech in 2018. But we're going to try anyway. So, whatever you do, and whatever you're going to be working on in 2018, here are 5 things that will definitely matter…

1. More empathy
It sounds obvious, but technology isn't just a human invention; it's also a human activity. It's something people build - together - and something people use. This is a fact that is all too often forgotten in the chaotic reality of modern work. However, if we can all have more empathy, both for those around us - other developers, our colleagues - and for users, we'll not only build better software, we'll also be much happier while doing so.

2. Developing standards and best practices
Open source has been a true revolution, impacting not only the software that's used today, but also the way we think about it. But with open source, the ground beneath us has opened up. And, if we're going to deliver on point #1, we're going to need to start thinking about setting standards for how we build software alongside each other. Living in the wild west might be fun for a while, but it will quickly grow old and make it difficult to produce anything of lasting value.

3. Ignoring the hype and focusing on what matters to you
Open source has made things chaotic - there's simply a lot of software out there. Even within a very specific area, there's a range of potential solutions to a single problem. It's important to remember that the most hyped or popular tool isn't necessarily going to be the one for you. Focus on the problem you're trying to solve and which solution is going to be most effective.

4. Dedicating time to personal development
While we shouldn't be distracted by the fluctuations of the tech landscape, it's important nevertheless to take steps to try and tame it. If it's going to work for you, you still need to do some work. Identify which tools are going to be key in your job next year, or which skills you need to move your career forward, and then create a plan for when you're going to actively invest time in learning those tools and skills.

5. Security
This may be obvious, but 2018 needs to be the year that everyone gets serious about security. High-profile hacks are only leading to more and more confusion around digital media. And on a slightly different but related note, with governments taking more of an interest in the internet habits of their citizens and in ensuring the internet is 'safe', it's going to take a lot of commitment on the part of the tech world to challenge lazy established thinking and to make sure individuals are secure - whether that's from criminals or otherwise...


What I want to happen in Hardware - 2015

Ed Bowkett
18 Mar 2015
5 min read
Apple Watch
Set for a tentative release in spring 2015, the Apple Watch is being termed 'the wearable to watch out for in 2015'. Whilst I can certainly see why (it's a product made by Apple), at £300 it's not a cheap wearable. Nonetheless, we should expect the same high quality as with all other Apple products. The cost, along with being tied into an iPhone contract, is what puts me off, but apart from that, the Apple Watch is set to be one of the most highly sought-after pieces of hardware for 2015.

Wearables
Wearables aren't going to go away. They continued their march in 2014; the market got swamped by similar products that essentially all did the same thing, just under different names and at varying prices. From the conclusion of CES 2015, this is set to continue. Don't get me wrong, if these wearables aid in improving your health and wellbeing, then I am all for them. However, when there are some 'advances' in technology such as the self-tying shoe, then you have to question how far, or rather, how ridiculous it is getting. Nevertheless, 2015 is set to continue with the wearables. I'm not anti-wearables, I have a Fitbit, and I've considered getting other wearables; I just question when it becomes too much. Call it 'wearable weariness'. I'm also aware I will likely be in the minority.

Virtual Reality
2014 also saw quite possibly the biggest acquisition in hardware: the purchase of Oculus by Facebook for a reported $2 billion, which shows a strong commitment from Facebook towards the future development of virtual reality. 2014 was all about the headsets – Project Morpheus from Sony, for example, and Samsung's own VR headset. Basically, virtual reality is here to stay. I'm expecting huge things from this area in 2015; however, I'm not expecting perfection, if that makes sense. Virtual reality for games will take a leap forward, and for gamers and hobbyists alike it will continue to fascinate, but it will still be in its infancy. Whilst I love the fact that all these companies are finally becoming aware of the popularity of virtual reality, they also need to work together to ensure VR becomes a thing rather than an ambition. As such, whilst I would love virtual reality to be a reality in 2015, I think it would be a push for this to happen. Of course, I don't know why I'd love it to happen, considering I suffer from motion sickness, but I can dream I guess.

Steam Machine
A little over a year ago, Steam Machines were announced. These were set to be 'living room' PCs from Valve. A year later and we're still waiting. But are we? I mean, a 'living room' PC is basically just a computer, right? That happens to be in your living room? With the delay, Valve's partners have gone ahead and published their own 'versions' of Steam Machines; Alienware have released the Alienware Alpha. I'm expecting further partners to release their versions, and I'm expecting customisation to be heavily marketed. I hope that Valve also announces and innovates on its own machine and encourages cheap but game-ready computers. That's the future, and that's what I want for 2015.

Internet of Things
Finally, the Internet of Things. In another blog I said this should really be classed as the Internet of Everything, because that's what it's becoming. Technology has permeated everything. The number of devices I can now access over Wi-Fi or Bluetooth to make me things is extraordinary. For example, there is a coffee machine where, if I send a text, it brews a coffee for me. I mean, that's great, but is it really needed? Or am I just being a grumpy grouch? Similarly, there are devices which monitor my mood (I don't know how) and can then translate that into the lighting. Again, do I need that? I'm viewing this area with both caution and grudging respect. Of course I need a coffee machine I can text. Of course I need a device that will track my sleep patterns and tell me what I need to do to improve them. Of course I need that wristband that tells me I don't do enough exercise. I need all the nagging machines to tell me that! On a serious note, technology reaching all walks of life can only be a good thing, but we need to ensure that our functionality as human beings isn't eroded. Sounds scarily like Terminator, I know, but I'd like to make myself a good cup of coffee once in a while. Maybe once a month?

I'm sure there are other hardware moments to watch out for in 2015, including the evolution of ARM boards and the ever-decreasing size of mobile phones. So what are you anticipating?


What Should You Learn… And Why

Oli Huggins
01 Sep 2016
5 min read
When it comes to the languages, frameworks, and skills that are available to us, what are the reasons we choose the ones we do? One of the key things that Skill Up 2016 looked to uncover was what motivated developers and IT professionals to pick up new tools. The results were interesting, and if you haven't looked at them yet, check out pages 23-25 of our Skill Up report here. The key reasons people either picked up a new piece of software or decided to drop it came down to 4 simple points:

- Which software offered the speed needed for the job?
- Which software did other programmers recommend?
- What was the most popular tech currently around?
- And finally, which piece of software was the best tool for the job in the end?

Why would you want to learn a new piece of software for these reasons, though? Let's have a look at each reason in more detail!

The Need For Speed – Docker
Who doesn't like a fast application? How many times have you been waiting for a web page to load, annoyed that it's taking so long? When it comes to customer needs, speed is right up there with usability, and as a developer you have to walk the fine line without sacrificing either for the other. No wonder Docker has become the go-to option for developers looking to get as much speed as they can without having to optimize every single part of their applications. Docker is designed to be lean to the nth degree; it loads in a millisecond most of the time. Who doesn't want that sort of speed in their applications? Docker's widespread adoption for its speed is proof enough that developers have seen its potential in the wider world, and if you haven't jumped on it yet, then it's definitely something you should start looking into – faster applications mean that everything is better and everyone is happier, after all.

The One Everyone Recommends – Swift
At the end of the day, what do you trust more? The unproven language or software, or the one that everyone is running up to you, trying to convince you to give it a try and let it change your life? While we all like to try new things, in the end the real winner will be the one that everyone else has already tried, loved, and is now attempting to convince you to take the plunge on too, so you can join them in the revolution they're currently going through. Apple is pushing Swift hard. Swift users are pushing Swift even harder. Improving on the key flaws seen after years of Objective-C development, and designed from the ground up to revolutionize the world of iOS development, Swift has created a fanbase that swears by it. People trying Swift for the first time consistently find that it's just as good as they were told, so why not give it a try for yourself if you haven't already? After all, could tens of thousands of developers be wrong?

The Popular One – React.js
No one wants to be using tired old "uncool" software. Tech is the business of innovation, and when a big company like Facebook releases a new piece of software to the wide world, people are going to turn their heads, take notice, and immediately get down to learning the complete ins and outs of it. Adding Facebook's secret weapon to your toolkit makes you a much more appealing developer to companies when the time comes. React has brought one of the biggest buzzes to the JavaScript world since the original Angular was released. Bringing many of the latest changes in web design, like a component-based approach, to JavaScript without sacrificing usability and ease of writing, it's pretty obvious why React has taken the world by storm – and it's not showing any signs of slowing down either!

The Best Tool For The Job – AWS
Probably the easiest reason to start picking up a new skill: which one of us hasn't wanted to go into a job or task with the best tool for it? Having the best of the best available to us means that the job is so much easier in the long run – there is less chance of issues popping up during and after development, and we can have peace of mind that our customers will get the best possible product we can provide, without much cost to us in stress or unforeseen circumstances. AWS is tried and tested when it comes to providing everything developers are looking for in a platform. It holds dozens of different tools for any situation, and when it comes to price, developers say it's one of the most competitive around. With Amazon handling the security side of things too, it's obvious why, for those who swear by it, AWS should be your first choice when looking into a cloud-based solution.

Are you looking to dig deeper into these pieces of tech, maybe even to start learning a new one? Have a look at our 5 for $50 bundles and get the most you can from them now, or sign up to our subscription service Mapt to get all of these and more at $29.99 per month!