
Author Posts - Data

37 Articles

What Should We Watch Tonight? Ask a Robot, says Matt Jones from OVO Mobile [Interview]

Neil Aitken
18 Aug 2018
11 min read
Netflix, the global poster child for streamed TV and for using big data to inform the programs it develops, has shown steady customer growth for several years now. Recently, the company revealed that it would be shutting down the user reviews that have long been so prominent in its media catalogue interface. In the background, media and telco are merging. AT&T, the telco behind one of the biggest deals in recent history, acquired Time Warner and wants HBO to become more like Netflix. Telia, a Swedish telecommunications company, agreed to buy Bonnier Broadcasting in late July 2018.

The video content landscape has changed a great deal in the last decade. Everyone in the entertainment game wants to move beyond broadcast TV and use data to develop content their users will love and that will give their customer base more variety. Data lets them charge higher subscription rates per user, experiment with tiered subscriptions, localize global content, globalize local content, and more. These changes raise two key questions. First, are we heading for a world in which AI- and ML-based algorithms drive what we watch on TV? And second, are the days of human recommendation quietly giving way to machine recommendations over which the user has no control?

[Chart: Netflix is acquiring customers fast. Source: Statista]

To get an insider's view on those questions, I sat down with Matt Jones of OVO Mobile, one of Australia's fastest growing telecommunications companies. OVO offers its customers a unique point of difference – streaming video sports content included in a phone plan. OVO has bought the rights to a number of niche sports in Australia which weren't previously available, and now offers free OTA (Over The Air) digital content for fans of 'unusual' sports like drag racing or gymnastics. OTA content is anything delivered to a user's phone over a wireless network. In OVO's case, the data used to transport the video content is free, so customers don't have to worry about paying more for mobile data to watch it – a key concern for users. OVO Mobile and Netflix are in very similar businesses, and Matt has a unique point of view on how artificial intelligence and machine learning will impact the world of telco and media.

Key takeaways

- What has changed our media consumption habits: the ubiquitous mobile internet, an always-on and connected younger generation, better mobile hardware, improved network performance and capabilities, and the desire for control over content choices.
- Digitization allows new features, some of which people have proven to love: binge watching, screening out ad breaks, and time shifting.
- The key to understanding the value of ML and AI is not the statistical or technical models that enable it; it is the way AI is used to improve the experience your digital customers have with you.
- AI in digital and app experiences personalizes what users see in ways old media never could.
- Content producers use the information they hold on us – the programs we watch, when we watch them, and for how long – to personalize, segment, and re-engage at scale.
- The ways AI and ML contribute to the delivery of online media are endless: personalization, context awareness, notification management, and more.
Social acceptance of media delivered on mobile phones is driving change

A number of overlapping factors are driving changes in how we engage with content. Social acceptance of the internet, and of mobile access to it, as a core part of life is one key enabler. From a technology perspective, things have changed too: smartphones now have bigger, higher-resolution screens than ever before, and they're with us all the time.

Jones believes this change is part of a cultural evolution in how we relate to technology. He says, "There has also been a generational shift which has taken place. Younger people are used to the small screen being the primary device. They're all about control, seeking out their interests and consuming these, as opposed to previous generations which were used to mass content distribution from traditional channels like TV."

Other factors include network performance and capability, which have improved dramatically in recent years. Data speeds have grown exponentially, from 3G networks – launched less than 15 years ago and only able to support stuttering, low-resolution video – to 4G and 4.5G networks that can support live streaming of high-definition TV. Mobile data allowances in plans, and offers from some phone companies to provide certain content 'data free' (as OVO does with theirs), have also driven uptake. Finally, people want convenience, and digital delivers it in a way they have never experienced before. Digitization allows new features, some of which people have proven to love: binge watching, screening out ad breaks, and time shifting.

What part can AI and machine learning play in the delivery of media online?

Artificial intelligence (AI) is already part of 85% of our online interactions, and Gartner suggests it will be part of every product in the future. The key to understanding the value of ML and AI is not the statistical or technical models that enable it; it is the way AI is used to improve the experience your digital customers have with you. When you find a new band on Spotify, when YouTube recommends a funny video you'll like, when Amazon shows you other products to consider alongside the one you just put in your basket – that's AI working to improve your experience.

"Over The Top content is exploding. Content owners are going direct to consumer and providing fantastic experiences for their users. What's changing is the use of AI in digital/app experiences to personalize what users see in ways old media never could," says Matt.

Matt's video content recommendation app, for example, 'learns' not just what you like to watch but also the times you are most likely to watch it. It then prompts users with a short video to entice them to watch – and the analytics show just how effective it is. Matt's app can be up to five times more successful at encouraging customers to watch his content than approaches that don't use it.

"The list of ways that AI/ML contributes to the delivery of media online is endless. Personalisation, context awareness, notification management… endless."

By offering users recommendations on content they'll love, producers can now engage more customers for longer.
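To make the idea concrete, here is a minimal, hypothetical sketch of the kind of time-and-taste-aware scoring Matt describes. All names and logic are invented for illustration – this is not OVO's actual system.

```python
from collections import defaultdict
from datetime import datetime

# Toy time-aware recommender: blend genre affinity with hour-of-day habits.
class TimeAwareRecommender:
    def __init__(self):
        # watch_counts[user][genre] and hour_counts[user][hour] accumulate history.
        self.watch_counts = defaultdict(lambda: defaultdict(int))
        self.hour_counts = defaultdict(lambda: defaultdict(int))

    def record_watch(self, user, genre, when):
        self.watch_counts[user][genre] += 1
        self.hour_counts[user][when.hour] += 1

    def score(self, user, genre, when):
        # "How much does this user like the genre" times
        # "how likely is this user to watch at this hour".
        total_watches = sum(self.watch_counts[user].values()) or 1
        total_hours = sum(self.hour_counts[user].values()) or 1
        genre_affinity = self.watch_counts[user][genre] / total_watches
        hour_affinity = self.hour_counts[user][when.hour] / total_hours
        return genre_affinity * hour_affinity

rec = TimeAwareRecommender()
rec.record_watch("alice", "drag_racing", datetime(2018, 8, 1, 20))
rec.record_watch("alice", "gymnastics", datetime(2018, 8, 2, 21))
rec.record_watch("alice", "drag_racing", datetime(2018, 8, 3, 20))

# Should a drag-racing highlight prompt go out at 8 pm? Higher score = yes.
print(rec.score("alice", "drag_racing", datetime(2018, 8, 10, 20)))
```

A real system would add decay for stale history and smoothing for unseen genres, but the shape of the signal – what you watch combined with when you watch – is the same.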
Content producers use the information they have on us – the programs we watch, when we watch them, and for how long – to:

Personalise at volume: Apps used to deliver content can personalise what's shown first to users, based on a number of variables known about them, including the sort of context awareness that is relatively easy to obtain on mobile devices. Ultimately, every AI customer experience improvement (including the examples that follow) is designed to automate the process of providing each individual with something they uniquely want. Automation means that can be done at scale, with every customer treated uniquely.

Notification management: AI that tracks the success of notifications – and, critically, acknowledges when they are not helpful – can be employed to alert users only about things they want to know. These AI solutions provide updates based on users' preferences and avoid serving irrelevant information.

Content discovery and re-engagement: AI and ML can be used to provide recommendations about what users could watch, exposing customers to content they would not otherwise find but are likely to value.

Better, more relevant advertising: Advertising that targets a legitimately held, real customer need is actually useful to viewers. Better analytics for AI can assist in targeting micro-segments with ads that contain information customers will value. Lattice, a business insights tool provider, offers a 'Lattice Engine' product that combines information held in multiple cloud-based locations and uses AI to automatically assign customers to the segment that suits them. Those data are then fed to a customer's eCommerce site and other channel interactions, and used to offer content that helps them convert better.

Developing better segments: Raw data on real customers can be gathered from digital content systems to inform above-the-line marketing in the real, non-digital world. Big data analytics can now be used with accurate segmentation for local area marketing, and to tie together digital and retail customer experiences. McKinsey suggests that 36% of companies are actively pursuing strategies driven from their big data reserves, and advises its clients that big data can be used to better understand and grow customer lifetime values. (A minimal clustering sketch appears just after this section.)

In the future, deep linking for calls-to-action: Some digital content is provided in a form that lets viewers find out more about an item on screen. Providing a way to deep link from a video screen into a shopping cart pre-populated with something just seen on screen is an exciting possibility. Cutting steps out of the buying process, so that eCommerce users can go from content apps straight to buying a product they've seen on screen, is likely to become a big business. Deep linking raises the value of the content shown to the degree that it raises sales of the products included.

Bringing it all together

Jones believes that those who invest big in AI and machine learning – and, among them, those who find a way to draw out insights and act upon them – will be the ultimate victors. "The big winners are going to be the people who connect a fan with content they love and use AI and ML to deliver the best possible experience. It's about using all the information you have about your users and acting on it," said Jones. That commercial incentive is already driving behavior.
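As a rough illustration of the segmentation point above, here is a minimal, hypothetical sketch that clusters viewers from simple engagement features. The feature names and data are invented for the example; commercial products such as Lattice Engine work very differently.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Invented viewer data.
# Columns: [minutes watched per week, distinct genres watched, sessions per week]
viewers = np.array([
    [420, 2, 10],
    [35,  1,  2],
    [300, 6,  8],
    [50,  1,  3],
    [390, 5, 12],
    [20,  2,  1],
])

# Standardize so no single feature dominates the distance metric.
features = StandardScaler().fit_transform(viewers)

# Assign each viewer to one of two segments (e.g., "heavy" vs "light").
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
print(kmeans.labels_)  # e.g., [0 1 0 1 0 1]
```

Each label can then drive a different treatment – which trailer to push, which notification cadence to use, which offer to show.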
AI and ML already provide personalized content recommendations. Progressive content companies, including Matt's, are working on building AI into every facet of every digital experience you have. As to whether AI is entirely replacing social recommendation, I don't think that's the case. The research says people are still four times more likely to watch a video if it is recommended to them by a friend. Reviews have always been important to pre-sales on the internet, and that applies to TV shows too. People want to know what real users felt when they used a product. If they can't get reviews from Netflix, they will simply open a new tab and google for reviews while deciding what to watch on Netflix.

About Matt Jones

Matt is an industry disruptor who launched the first-of-its-kind media and telco brand OVO Mobile in 2015. He is the driving force behind the convergence of new media and telco, bringing together telecommunications with media rights and digital broadcast for mass distribution. OVO is a new type of telco, delivering content that fans are passionate about, streamed live to their mobile or tablet, unlimited and data free. OVO has secured exclusive three-year-plus digital broadcast and distribution rights for a range of content owners including Supercars, World Superbikes, 400 Thunder Drag Series, Audi Australia Racing and Gymnastics Australia – with a combined Australian audience estimated at over 7 million.

OVO is a multi-award winner, including the Money Magazine Best of the Best Award 2017 for high usage, and has featured on A Current Affair, Sunrise, The Today Show, Channel 7 News, Channel 9 News and multiple radio shows for its world-first kids' mobile phone plan with built-in cyber security protection. As OVO CEO, Matt was nominated for Start-Up Executive of the Year at the CEO Magazine Awards 2017 and was awarded runner-up. The award recognises the achievements of leaders and professionals, and the contributions they have made to their companies across industry-specific categories.

Matt holds a Bachelor of Arts (BA) from the University of Tasmania and regularly speaks at telco, sports marketing and media forums and events. He has held executive leadership roles at leading telecommunications brands including Telstra (Head of Strategy – Operations), Optus, Vodafone, AAPT and Telecom New Zealand, as well as global management consulting firms including BearingPoint. Matt lives on the northern beaches of Sydney with his wife Mel and daughters Charlotte and Lucy.


Expert Insights: How sports analytics is empowering better decision-making

Amey Varangaonkar
14 Nov 2017
11 min read
Analytics is slowly changing the face of the sports industry as we know it. Data-driven insights are being used to improve team and individual performance, and to get that all-important edge over the competition. But what exactly is sports analytics? And how is it being used? What better way to get answers to these questions than to ask an expert himself!

Gaurav Sundararaman is a Senior Stats Analyst at ESPN, currently based in Bangalore, India. With over 10 years of experience in the field of analytics, Gaurav worked as a research analyst and a consultant in the initial phase of his career. He then ventured into sports analytics in 2012 and played a major role in the analytics division of SportsMechanics India Pvt. Ltd., where he was the analytics consultant for the T20 World Cup-winning West Indies team in 2016.

In this interview, Gaurav takes us through the current landscape of sports analytics, and talks about how analytics is empowering better decision-making in sports.

Key takeaways

- Sports analytics is about finding actionable, useful insights in sports data, which teams can use to gain a competitive advantage over the opposition.
- Instincts backed by data make on- and off-field decisions more powerful and accurate.
- The rise of IoT and wearable technology has boosted sports analytics; with more data available for analysis, insights can be unique and very helpful.
- Analytics is used in sports for everything from improving player performance to optimizing ticket prices and understanding fan sentiment.
- Knowledge of tools for data collection, analysis and visualization, such as R, Python and Tableau, is essential for a sports analyst.
- A thorough understanding of the sport, an up-to-date skill set, and strong communication with players and management are equally important for effective analytics.
- Adoption of analytics within sports has been slow but steady. More and more teams are realizing the benefits of sports analytics and adopting analytics-based strategies.

Complete interview

Analytics is finding widespread application in almost every industry today. How has the sports industry changed over the years, and what role is analytics playing in this transformation?

The sports industry has been relatively late in adopting analytics. That said, the use of analytics in sports has also varied geographically. In the West, analytics plays a big role in helping teams, as well as individual athletes, make decisions. Better infrastructure and quick adoption of the latest trends in technology are important factors here. Also, investment in sports starts from a very young age in the West, which makes a huge difference. In contrast, many countries in Asia are still lagging behind when it comes to adopting analytics, and still rely on traditional techniques to solve problems. A combination of analytics with traditional knowledge from experience would go a long way in helping teams, players and businesses succeed.

Previously the sports industry was a very closed community. With the advent of analytics, the industry has managed to expand its horizons. We see more non-sportsmen playing a major part in decision making. They understand the dynamics of the sports business and how to use data-driven insights to influence it.

Many major teams across different sports – football (soccer), cricket, American football, basketball and more – have realized the value of data and analytics. How are they using it?
What advantages does analytics offer them?

One thing I firmly believe is that analytics can't replace skills and can't guarantee wins. What it can do is ensure there is logic behind certain plans and decisions. Instincts backed by data make decisions more powerful. I always tell coaches and players: go with your gut and instincts as Plan A. If that does not work out, your fallback can be a Plan B based on trends and patterns derived from data. It turns out to be a win-win for both. Analytics offers a neutral perspective which players or coaches may not otherwise see. Each sport has a unique way of applying analytics to decisions and, as analysts, we need to understand the context and map the relevant data. As far as using analytics is concerned, the goals are pretty straightforward: be the best, beat the opponents and aim for sustained success. Analytics helps you achieve each of these objectives.

The rise of IoT and wearable technology over the last few years has been incredible. How has it affected sports, and sports analytics in particular?

It is great to see that many companies are investing in such technologies. It is important to identify where wearables and IoT can be used in sport and where they can have maximum impact. These devices allow in-game monitoring of players, their performance and their current physical state. Beyond the field, I believe these technologies will be very useful in engaging fans as well. Data derived from these devices could be used in broadcasting, as well as in providing a good experience for fans in stadiums. This will encourage more people to watch games in stadiums rather than from the comfort of their homes. We have already seen a beginning, with a few stadiums around the world leveraging IoT: the Mercedes-Benz Stadium (home of the Atlanta Falcons) is a high-tech stadium powered by IBM, and Sacramento is building a state-of-the-art facility for the Sacramento Kings. This is just the start, and it will only get better with time.

How does one become a sports analyst? Are there particular courses or certifications one needs to complete? Can you share your journey in sports analytics with us?

To be honest, there are no professional courses yet in India to become an analyst. A couple of colleges have just started offering sports analytics as a course in their post-graduation programs. However, there are a few companies (Sports Mechanics and Kadamba Technologies in Chennai) that offer jobs which can set you on the path to becoming a sports analyst if you are really good. If you are a freelancer, my advice would be to brand yourself well, showcase your knowledge through social media platforms, and get a breakthrough via contacts.

After my MBA, Sports Mechanics – a leader in this space, based in Chennai – was looking for someone to start their data practice. I was just lucky to be at the right place at the right time. I worked there for four years and was able to learn a lot about the industry, and about what works and what does not. Being in a small company, I was lucky to wear multiple hats and work on different projects across the value chain. I then moved to the lovely team at ESPNcricinfo, where I work in their stats team.

What are the tools and frameworks that you use for your day-to-day tasks? How do they make your work easier?

There are no specific tools or frameworks; it depends on the enterprise you are working for.
Usually they are proprietary tools of the company, used to collect, mine or visualize data. Interpreting the information and presenting it in a manner users understand is important, and that is where certain applications or frameworks come in. However, to be ready for the future, it is good to be skilled in tools that support data collection, analysis and visualization – R, Python and Tableau, to name a few.

Do sports analysts have to interact with players and the coaching staff directly? How do you communicate your insights and findings to the relevant stakeholders?

Yes, they have to interact with players and management directly; if not, the impact will be minimal. Communicating insights is very important in this industry – too much analysis can lead to paralysis. We need to identify what exactly each player or coach is looking for, based on their game, and provide the information in a crisp manner that helps them make decisions on and off the field. The magnitude of the information differs per stakeholder: for the coach and management, insights can be detailed, while for the players we need to keep it short and to the point.

The insights you generate must not be limited to enhancing a team's on-field performance. Could you give us some examples?

Insights can vary. For management, it could be about maximising revenue or saving money in an auction. For coaches, it could reveal their own team's and the opposition's strengths and weaknesses from a different perspective. For captains, data could help identify key strategies on the field. For example, in cricket, it could help the captain determine which bowler to bring on against which opposition batsman, or where to place the fielders (a short data sketch of this idea appears below). Off the field, one area where analytics could play a big role is grassroots development: tracking an athlete from an early age to ensure he is prepared for the biggest stage. Monitoring performance, improving physical attributes by following a specific regimen, assessing injury records and designing specific training programs are some ways in which this could be done.

What are some of the other challenges that you face in your day-to-day work?

Growth in this industry can be slow sometimes. You need to be very patient, work hard and follow the sport very closely. There are not many analytical obstacles as such, but understanding the requirements, and what exactly the data needs are, can be quite a challenge.

Despite all the buzz, quite a few sports teams and organizations are still reluctant to adopt an analytics-based strategy. Why do you think that is, and what needs to change?

The reason for the slow adoption could be the lack of successful case studies and of awareness. In most sports, when so many decisions are taken on the field, the players' ability and skill can seem far superior to anything else. As more instances of successful execution of data-based plans come up, we are likely to see more teams adopting data-based strategies. As I mentioned earlier, analytics should be used to help the coach and captain make the most logical and informed decisions. Decision-makers need to be aware of how it is used and how much impact it can have. This awareness is vital to increasing the adoption of analytics in sports.
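To make the bowler-versus-batsman example concrete, here is a minimal, hypothetical pandas sketch. The ball-by-ball records and player names are invented for illustration; real analysts work with far richer proprietary feeds.

```python
import pandas as pd

# Invented ball-by-ball records: bowler, batsman, runs conceded, wicket taken.
balls = pd.DataFrame({
    "bowler":  ["Ashwin", "Ashwin", "Bumrah", "Bumrah", "Ashwin", "Bumrah"],
    "batsman": ["Smith",  "Smith",  "Smith",  "Warner", "Warner", "Smith"],
    "runs":    [1, 0, 4, 0, 6, 0],
    "wicket":  [0, 1, 0, 0, 0, 1],
})

# Aggregate each bowler-batsman matchup: balls bowled, runs conceded, wickets.
matchups = (balls.groupby(["bowler", "batsman"])
                 .agg(balls=("runs", "size"),
                      runs=("runs", "sum"),
                      wickets=("wicket", "sum")))

# Runs conceded per ball for each matchup helps a captain pick the next bowler.
matchups["runs_per_ball"] = matchups["runs"] / matchups["balls"]
print(matchups.sort_values("runs_per_ball"))
```

With a season's worth of data, the same three-line aggregation answers exactly the captain's question above: which bowler to bring on against which batsman.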
Where do you see sports analytics in the next five to ten years?

Today in sports, many decisions are taken on gut feeling, and I believe there should be a balance. That is where analytics can help. In sports like cricket, only around 30% of the data is used, and more emphasis is given to video. If we look at soccer or basketball, the usage of data and video analytics is closer to 60-70% of its potential. Through awareness, and by trying out new plans based on data, we can increase the usage of analytics in cricket to 60-70% in the next few years.

Despite the current shortcomings, it is fair to say that there is a progressive and positive change at the grassroots level across the world. Data-based coaching and access to technology are slowly being made available to teams as well as budding sportsmen and women. Another positive is that investment in the sports industry is growing steadily. I am confident that in a couple of years we will see more job opportunities in sports. Maybe in five years the entire ecosystem will be more structured and professional. We will witness analytics playing a much bigger role in helping stakeholders make informed decisions, as data-based insights become even more crucial.

Lastly, what advice do you have for aspiring sports analysts?

My only advice would be: be passionate, build a strong network of people around you, and constantly be on the lookout for opportunities. It is also important to keep updating your skill set in terms of the tools and techniques needed to perform efficient, faster analytics. Newer and better tools keep coming out that make your work easier and faster – be on the lookout for them! You also need to identify your own niche based on your strengths and build on that. The industry is on the cusp of growth, and as budding analysts we need to be prepared to take off when the industry matures. Build your brand and talk to more people in the industry; figure out what you want to do, so you keep yourself in the best position to grow with the industry.


Discussing SAP: Past, present and future with Rehan Zaidi, senior SAP ABAP consultant [Interview]

Savia Lobo
04 Oct 2018
11 min read
SAP, the market-leading enterprise software company, recently became the first European technology company to create an AI ethics advisory panel, announcing seven guiding principles for AI development. These guidelines revolve around recognizing AI's significant impact on people and society. Last week, at the Microsoft Ignite conference, SAP, in collaboration with Microsoft and Adobe, announced the Open Data Initiative. The initiative aims to help companies better govern their data and support privacy and security initiatives; for SAP, it will bring further advancements to its SAP C/4HANA and S/4HANA platforms. All of these actions emphasize SAP's focus on transforming itself into a responsible data-use company.

We recently interviewed Rehan Zaidi, a senior SAP ABAP consultant. Rehan became one of the youngest SAP authors worldwide when he was published in the prestigious SAP Professional Journal in 2001. He has written a number of books, and over 20 articles and professional papers for the SAP Professional Journal USA and HR Expert USA, part of the sapexperts.com library. Following are some of his views on the SAP community and products, and how the SAP suite can benefit people, including budding professionals, developers and business professionals.

Key takeaways

- SAP HANA was introduced to run jobs up to 200 times faster while maintaining efficiency.
- The introduction of SAP Leonardo brought in the next wave of AI, machine learning and blockchain services via the SAP Cloud Platform and standalone projects.
- Experienced ABAP developers should look to get certified in one of the newer technologies, such as HANA and Fiori.
- SAP ERP Central Component (SAP ECC) is the on-premises version of SAP, usually implemented in medium and large companies; for smaller companies, SAP offers its Business One ERP platform.
- SAP Fiori is a line of SAP apps meant to address criticisms of SAP's user experience and UI complexity.

Q.1. SAP is one of the most widely used ERP tools. How has it evolved over the past few years from the traditional on-premise model to keep up with cloud-based trends?

Yes, let me cover the main points. SAP started as a company in 1973, when its first product, SAP R/98, was launched. In 1979, SAP launched the R/2 design. It had most of the typical processes, such as accounting, manufacturing, supply chain logistics and human resources. Then came R/3, which brought the more efficient three-tier architecture (application server, database, and the presentation layer/GUI), with more new modules and functionality added. It was a smart system, fully configurable by functional consultants. This was further enhanced with NetWeaver capability, which brought integration with the internet and SOA capability. SAP then introduced the ECC 5 and subsequently the ECC 6 release. Mobility was later added, letting mobile applications running on devices access business processes in SAP and execute them; both display and update of SAP data became possible. The HANA system was then introduced – very fast and efficient, allowing jobs to run up to 200 times faster than before. Cloud systems then became available, letting customers connect their on-premise systems to the SAP Cloud Platform and access services such as Mobile Service for app protection and Mobile Service for SAP Fiori, among others.
SAP Leonardo was finally introduced, as a way of bringing in next-generation AI, machine learning and blockchain services via standalone projects and the SAP Cloud Platform.

Q.2. Being a senior ABAP programming analyst, what does your typical day look like?

Ahh, a typical day! No two days are the same for us. Each morning we find ourselves confronting a problem whose solution has to be devised – a different problem every day, followed by a unique solution. We spend hours and hours finding issues in custom-developed programs. We learn about making custom programs run faster. We get requirements from a wide variety of users – in human resources, materials management, sales and distribution, finance, and so on. A requirement may pertain to an entirely new report or to a dialog program with a set of screens. We even build Fiori applications (using a JavaScript-based library) that are accessible from a PC or a mobile device. I also get asked to teach junior or trainee SAP developers a wide variety of technologies.

Q.3. Can you tell us about the learning curve for SAP? There are different job profiles related to SAP, ranging from executives to consultants and managers. How does each of them learn or stay current on SAP?

Yes, this is a very important question. A simple answer is that there is no end to learning, and at any stage learning is never enough, no matter which field within SAP you belong to. Things are constantly changing. The more you read and the more you work, the more you feel there is still a lot to be done. You need to constantly update yourself and learn about new technologies. There is plenty of material available on the internet. I usually refer to the official SAP website for newer courses; they even tell you which backgrounds (managers, developers) the courses are relevant to. I also go to open.sap.com for new courses.

Whether they are consultants (functional and technical) or managers, all of them need to keep themselves up to date. They must take new courses and learn about innovation in their technology. For example, HR consultants must now study SuccessFactors. Even the integration of SAP HANA with other software is an interesting topic today. There are Fiori- and HANA-related courses for Basis consultants, and corresponding tracks for developers. Some know-how of newer technologies is also important for managers and executives, since your decisions may need to be adapted to the underlying technologies running in your systems. You should know the pros and cons of all technologies in order to make the correct move for your business.

Q.4. Many believe an SAP certification improves their chances of getting jobs at competitive salaries. How important are certifications, and which SAP certifications should a budding developer look to obtain?

When I did my certification in October 2000, I used to think that certifications were not important. But now I have realized that, yes, they make a difference. Certifications are definitely a plus point: they enhance your CV and give you an edge over those who are not certified. I have found job adverts that specifically mention that certification is required or advantageous. However, they are only useful when you have at least four years of experience; for a fresh graduate, a certification might not be very useful.
A useful SAP consultant or developer combines a solid foundation of knowledge with a touch of experience. I suggest all my juniors go for certifications in order to strengthen their concepts, including:

- C_C4C30_1711 – SAP Certified Development Associate – SAP Hybris Cloud for Customer
- C_CP_11 – SAP Certified Development Associate – SAP Cloud Platform
- C_FIORDEV_20 – SAP Certified Development Associate – SAP Fiori Application Developer
- C_HANADEV_13 – SAP Certified Development Associate – SAP HANA
- C_SMPNHB_30 – SAP Certified Development Associate – SAP Mobile Platform Application Development (SMP 3.0)
- C_TAW12_750 – SAP Certified Development Associate – ABAP with SAP NetWeaver 7.50
- E_HANAAW_12 – SAP Certified Development Specialist – ABAP for SAP HANA

For experienced ABAP developers, I suggest getting certified in the newest technologies, such as HANA and Fiori. They may help you get a project quicker and/or at a better rate than others.

Q.5. The present buzz is around AI, machine learning, IoT, big data and many other emerging technologies. SAP Leonardo works on making it easy to create frameworks for harnessing the latest tech. What are your thoughts on SAP Leonardo?

Leonardo is SAP's response to the AI platform trend. It should be an important part of SAP's offerings, mostly built on the SAP Cloud Platform. SAP has relaunched Leonardo as a digital innovation system. As I understand it, Leonardo allows customers to take advantage of artificial intelligence, machine learning, advanced analytics and blockchain on their company's data. SAP gives customers an efficient way of using these technologies to solve business issues. It allows you to build a system which, in conjunction with machine learning, searches for results that can be combined with SAP transactions.

The benefit of SAP Leonardo is that all the company's data is available right in the SAP system. Using Leonardo, you have access to all human resources data and any other module data residing in the ERP system. Any company from any industry can make use of Leonardo; it works equally well for retailers, food and beverage companies and medical industries – for organizations in retail, manufacturing and automotive. An approach that works for one company in a given industry can be applied to other companies in that industry. Suppose a company operates sensors: it can link the sensor data with the data in its SAP systems, and even with other data, and then use Leonardo's capabilities to solve problems or optimize performance. When a problem is solved for one company in an industry, a similar solution may be applied across the entire industry. So yes, in my opinion Leonardo has a bright future and should be successful. For more information about Leonardo success stories, I encourage readers to check out the SAP Leonardo Internet of Things Portfolio & Success Stories.

Q.6. You are currently writing a book on ABAP Objects and design patterns, expected to be published by the end of 2018. What was your motivation behind writing it? Can you tell us more about ABAP Objects, and what should readers expect from this book?

ABAP and ABAP Objects have undergone tremendous changes over time, both in features and capability as well as in syntax. It is one of the most unsung topics of today: it has been around for quite a while, but most developers are not aware of it, or are not comfortable enough to use it in their day-to-day work. ABAP is a vast community, with developers working in a variety of functional areas.
The concepts covered in the book will be generic, allowing learners to apply them to their particular area. The book will cover ABAP Objects (the object-oriented extension of the SAP language ABAP) in the latest release of SAP NetWeaver 7.5 and explain the newest advancements. It will start with the programming of objects in general and the basics of the ABAP language a developer needs to get started, and will cover the most important topics needed for everyday support jobs and for succeeding in projects.

The book will be goal-directed, not a collection of theoretical topics. It won't just touch the surface of ABAP Objects, but will go in depth, from building the basic foundation (e.g., classes and objects created locally and globally) through intermediate areas (e.g., ALV programming, method chaining, polymorphism, simple and nested interfaces) and finally into advanced topics (e.g., shared memory, persistent objects). Best practices for writing better programs with ABAP Objects are shown at the end. No long stories, no boring theory – only pure technical concepts followed by simple examples, with code built around football players. Everything will be presented in a clear, interesting manner, and readers will learn tips and tricks they can apply immediately. Learners, students, new SAP programmers and SAP developers with some experience can use this as an alternative to expensive training books. The book will also save readers time searching the internet for help writing new programs.

Knowing ABAP Objects is key for ABAP developers who want to move forward. Simple ALV reporting requirements, defining and catching exceptional situations that may occur in a program, and even the BAdI enhancement technology that lets you extend standard SAP applications all require a sound understanding of ABAP Objects. In addition, Web Dynpro application development, the Business Object Processing Framework, and even OData service creation to expose data for use by Fiori apps all demand solid knowledge of ABAP Objects.
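The book's own examples are in ABAP, but the object-oriented ideas Rehan lists – classes, interfaces, polymorphism – are language-agnostic. Here is a minimal, hypothetical sketch of the football-player theme rendered in Python, purely to illustrate the concepts (invented names, not the book's code):

```python
from abc import ABC, abstractmethod

# Illustrative only: interface + polymorphism, the concepts described above,
# shown in Python rather than ABAP.
class Player(ABC):
    def __init__(self, name: str):
        self.name = name

    @abstractmethod
    def describe_role(self) -> str:
        """Each concrete player type supplies its own description."""

class Goalkeeper(Player):
    def describe_role(self) -> str:
        return f"{self.name} guards the goal."

class Striker(Player):
    def describe_role(self) -> str:
        return f"{self.name} leads the attack."

# Polymorphism: the same call works on any Player subclass.
squad = [Goalkeeper("Alisson"), Striker("Salah")]
for player in squad:
    print(player.describe_role())
```

In ABAP Objects the same shape appears as an interface or abstract class with redefined methods, which is exactly the foundation-to-polymorphism progression the book describes.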


Why learn IBM SPSS Modeler in 2017

Amey Varangaonkar
03 Nov 2017
9 min read
IBM's SPSS Modeler provides a powerful, versatile workbench that allows you to build efficient and accurate predictive models in no time. What else separates IBM SPSS Modeler from other enterprise analytics tools out there today? To find out, we talked to arguably two of the most popular members of the SPSS community.

Keith McCormick is a career-long practitioner of predictive analytics and data science, and has been engaged in statistical modeling, data mining and mentoring others in this area for more than 20 years. He is also a consultant, an established author and a speaker. Although his consulting work is not restricted to any one tool, his writing and speaking have made him particularly well known in the IBM SPSS Statistics and IBM SPSS Modeler communities.

Jesus Salcedo is an independent statistical consultant who has been using SPSS products for over 20 years. He holds a Ph.D. in Psychometrics from Fordham University, is a former SPSS Curriculum Team Lead and Senior Education Specialist, and has developed numerous SPSS learning courses and trained thousands of users.

In this interview with Packt, Keith and Jesus give us more insights on Modeler as a tool, the different functionalities it offers, and how to get the most out of it for all your data mining and analytics needs.

Key interview takeaways

- IBM SPSS Modeler is easy to get started with, but can be a tricky tool to master.
- Knowing your business, your dataset, and the algorithms you are going to apply are key factors to consider before building an analytics solution with SPSS Modeler.
- SPSS Modeler's scripting language is Python, and the tool supports running R code.
- IBM SPSS Modeler Essentials helps you effectively learn data mining and analytics, with the focus on working with data rather than on coding.

Full interview

Predictive analytics has garnered a lot of attention of late, and adopting an analytics-based strategy has become the norm for many businesses. Why do you think this is the case?

Jesus: I think this is happening because everyone wants to make better-informed decisions. Additionally, predictive analytics brings the added benefit of discovering new relationships that you were previously not aware of.

Keith: That's true, but it's even more exciting when the models are deployed and are potentially driving automated decisions.

With over 40 years of combined experience in this field, you are master consultants and trainers, with unrivaled expertise in the IBM SPSS products. Please share with us the story of your journey in this field. Our readers would also love to know what your day-to-day schedule looks like.

Jesus: When I was in college, I had no idea what I wanted to be. I took courses in many areas, but I avoided statistics because I thought it would be a waste of time – after all, what else is there to learn other than calculating a mean and plugging it into fancy formulas? (As a kid I loved baseball, so I was very familiar with calculating various baseball statistics.) Anyway, I took my first statistics course (where I learned SPSS) since it was a requirement, and I loved it. Soon after, I became a teaching assistant for more advanced statistics courses, and I eventually earned my Ph.D. in Psychometrics, all the while doing statistical consulting on the side. After graduate school, my first job was as an education consultant for SPSS (where I met Keith).
I worked at SPSS (and later IBM) for seven years, at first focusing on training customers in statistics and data mining, and later on developing course materials for our trainings. In 2013 Keith invited me to join him as an IBM partner, so we both trained customers and developed a lot of new and exciting material in both book and video formats. Currently, I work as an independent statistical and data-mining consultant, and my daily projects range from analyzing data for customers and training customers to analyze their own data, to creating books and videos on statistics and data mining.

Keith: Our careers have lots of similarities, and my current day-to-day is similar too. Lately, about a third of my year is lecturing and curriculum development for organizations like TDWI (Transforming Data with Intelligence), The Modeling Agency, and UC Irvine Extension. The majority of my work is predictive analytics consulting. I especially enjoy projects where I'm brought in early and can help with strategy and planning, then coach and mentor a team until they are self-sufficient. Sometimes building the team is even more exciting than the first project, because I know they will be able to do many more projects in the future.

There is a plethora of predictive analytics tools in use today, for desktops and enterprises. IBM SPSS Modeler is one such tool. What advantages does SPSS Modeler have over the others, in your opinion?

Keith: One of our good friends, who co-authored the IBM SPSS Modeler Cookbook, made an interesting comment about this at a conference. He is unique in that he has delivered one-day seminars using several different software tools, and as you know, it is difficult to present data mining in just one day. He said that only with Modeler is he able to spend time on each of the CRISP-DM phases of a case study in a day. I think he feels this way because it's among the easiest options to use. We agree: while powerful, and while it takes a whole career to master everything, it is easy to get started with.

Are there any prerequisites for using SPSS Modeler? How steep is the learning curve to start using the tool effectively?

Keith: The first thing I want to mention is that there are no prerequisites for our Packt video IBM SPSS Modeler Essentials; in it, we assume that you are starting from scratch. For the tool in general, there aren't any specific prerequisites as such, although knowing your data, and what insights you are looking for, always helps.

Jesus: Back at the office, to be successful on a data mining project or to use the tool efficiently, you'll need to know your business, your data, and the modeling algorithm you are using.

Keith: The other question we get all the time is how much statistics and machine learning you have to know. Our advice is to start with one or maybe two algorithms and learn them well, and try to stick to algorithms that you know. In our Packt course, we mostly focus on Decision Trees, one of the easiest to learn.

What do you think are the three key takeaways from your course, IBM SPSS Modeler Essentials?

The three key takeaways, we feel, are:

1. Start slow. Don't pressure yourself to learn everything all at once. There are dozens of "nodes" in Modeler; we introduce the most important ones, so start there.
2. Be brilliant in the basics. Get comfortable with the software environment. We recommend the best ways to organize your work.
3. Don't rush to modeling.
Remember the Cross Industry Standard Process for Data Mining (CRISP-DM), which we cover in the video. Use it to make sure that you proceed systematically and don't skip critical steps.

IBM recently announced that SPSS Modeler would be available free for educational use. How can one make the most of this opportunity?

Jesus: A large portion of our work over the past few years has been training people to analyze data. Professors are in a unique position to expose more students to data mining: we teach only those students whose work requires this type of training, whereas professors can reach a much larger group of people. IBM offers several programs that support professors, students and faculty; for more information, visit: https://www-01.ibm.com/software/analytics/spss/academic/

Keith: When seeking out a university class, whether classroom or online, ask whether they use Modeler, or whether they allow you to complete your homework assignments in Modeler. We recognize that R-based classes are very popular now, but you potentially won't learn as much about data mining: sometimes too much of the class is spent on coding, so you learn R but learn less about analytics. You want to spend most of the class time actively working with data and producing results.

With the rise of open source languages such as R and Python and their applications in predictive analytics, how do you foresee enterprise tools like SPSS Modeler competing with them?

Keith: Perhaps surprisingly, we don't think Modeler competes with R or Python. A lot of folks don't know that Python is Modeler's scripting language. That is an advanced feature, and we don't cover it in the Essentials video, but learning Python actually increases your knowledge of Modeler. And Modeler supports running R code right in a Modeler stream by using the R nodes, so Modeler power users (and future power users) should keep learning R on their to-do list. If you prefer not to use code, you can produce powerful results without learning either, just using Modeler straight out of the box. So it really is all up to you.

If this interview has sparked your interest in learning more about IBM SPSS Modeler, make sure you check out our video course IBM SPSS Modeler Essentials right away!
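Keith's advice above is to learn one algorithm well, and the Essentials course centres on decision trees. For readers coming from the open source side he mentions, here is a minimal, hypothetical scikit-learn sketch of the same technique on toy data. Inside Modeler you would build the equivalent visually, with a tree node on a stream, not with code.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy decision-tree example, illustrating the algorithm discussed above.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# A shallow tree keeps the model easy to read and explain to stakeholders.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)
print(f"held-out accuracy: {tree.score(X_test, y_test):.2f}")
```

The appeal is the same in either environment: the fitted tree is a set of readable if-then rules, which suits the "be brilliant in the basics" advice.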


Has Machine Learning become more accessible?

Packt Editorial Staff
04 Sep 2017
9 min read
Sebastian Raschka is a machine learning expert. He is currently a researcher at Michigan State University, where he is working on computational biology. He is also the author of Python Machine Learning, the most popular book ever published by Packt – a book that has helped define the field, breaking it out of the purely theoretical and showing readers how machine learning algorithms can be applied to everyday problems. Python Machine Learning was published in 2015, but Sebastian is back with a brand new edition, updated and improved for 2017, written alongside his colleague Vahid Mirjalili. We were lucky enough to catch Sebastian between his research and the new edition to ask him a few questions about what's new in the second edition of Python Machine Learning, and to get his assessment of the key challenges and opportunities in data science today.

What's the most interesting takeaway from your book?

Sebastian Raschka: In my opinion, the key takeaway from my book is that machine learning can be useful in almost every problem domain. I cover a lot of different subfields of machine learning in the book: classification, regression analysis, clustering, feature extraction, dimensionality reduction, and so forth. By providing hands-on examples for each of these topics, my hope is that people can find inspiration for applying these fundamental techniques to drive their research or industrial applications. Also, using well-developed and maintained open source software makes machine learning very accessible to a broad audience of experienced programmers as well as people who are new to programming. And by introducing the basic mathematics behind machine learning, we can appreciate machine learning as more than just black-box algorithms, giving readers an intuition of its capabilities but also its limitations, and of how to apply those algorithms wisely.

What's new in the second edition?

SR: As time and the software world moved on after the first edition was released in September 2015, we decided to replace the introduction to deep learning via Theano. No worries, we didn't remove it – but it got a substantial overhaul and is now based on TensorFlow, which has become a major player in my research toolbox since its open source release by Google in November 2015. Along with the new introduction to deep learning using TensorFlow, the biggest additions to this new edition are three brand new chapters focusing on deep learning applications: a more detailed overview of the TensorFlow mechanics, an introduction to convolutional neural networks for image classification, and an introduction to recurrent neural networks for natural language processing. Of course, in a similar vein to the rest of the book, these new chapters do not only provide readers with practical instructions and examples but also introduce the fundamental mathematics behind those concepts, an essential building block for understanding how deep learning works.

What do you think is the most exciting trend in data science and machine learning?

SR: One interesting trend in data science and machine learning is the development of libraries that make machine learning even more accessible. Popular examples include TPOT and AutoML/auto-sklearn – in other words, libraries that further automate the building of machine learning pipelines.
While such tools do not aim to replace experts in the field, they may make machine learning accessible to an even broader audience of non-programmers. However, being able to interpret the outcomes of predictive modeling tasks and to evaluate the results appropriately will always require a certain amount of knowledge. Thus, I see those tools not as replacements but rather as assistants for data scientists, automating tedious tasks such as hyperparameter tuning.

Another interesting trend is the continued development of novel deep learning architectures and the large progress in deep learning research overall. We've seen many interesting ideas, from generative adversarial networks (GANs) to densely connected networks (DenseNets) and ladder networks. Large progress has been made in this field thanks to those new ideas and to the continued improvement of deep learning libraries (and our computing infrastructure), which accelerate the implementation of research ideas and the development of these technologies in industrial applications.

How has the industry changed since you first started working?

SR: Over the years, I have noticed that more and more companies embrace open source – for example, by sharing parts of their tool chain on GitHub, which is great. Also, data science and open source related conferences keep growing, which means more and more people are not only getting interested in data science but also considering working together, for example as open source contributors in their free time, which is nice. Another thing I noticed is that, as deep learning becomes more and more popular, there seems to be an urge to apply it to problems even if it doesn't necessarily make sense – the urge to use deep learning just for the sake of using deep learning. Overall, the positive thing is that people get excited about new and creative approaches to problem solving, which can drive the field forward. Also, more and more people from other domains are becoming familiar with the techniques used in statistical modeling (thanks to "data science") and machine learning. This is nice, since good communication in collaborations and teams is important, and common knowledge of the basics makes that communication a bit easier.

What advice would you give to someone who wants to become a data scientist?

SR: I recommend starting with a practical, introductory book or course to get a brief overview of the field and the different techniques that exist. A selection of concrete examples is beneficial for understanding the big picture and what data science and machine learning are capable of. Next, I would start a passion project and try to apply the newly learned techniques from statistics and machine learning to address interesting questions related to that project. While working on an exciting project, the practitioner will naturally become motivated to read through more advanced material and improve their skills.

What are the biggest misunderstandings and misconceptions people have about machine learning today?

SR: Well, there's this whole debate about AI turning evil. As far as I can tell, the fear mongering is mostly driven by journalists who don't work in the field and are apparently looking for catchy headlines. Anyway, let me not dwell on this topic, as readers can find plenty of information (from both viewpoints) in the news and all over the internet.
To answer with Andrew Ng's famous quote: "I don't work on preventing AI from turning evil for the same reason that I don't work on combating overpopulation on the planet Mars."

What's so great about Python? Why do you think it's used in data science and beyond?

SR: It is hard to tell which came first: Python becoming a popular language so that many people developed all the great open source libraries for scientific computing, data science and machine learning, or Python becoming so popular due to the availability of those libraries. One thing is obvious, though: Python is a very versatile language that is easy to learn and easy to use. While most algorithms for scientific computing are not implemented in pure Python, Python is an excellent language for interacting with very efficient implementations in Fortran, C/C++ and other languages under the hood. Calling code in computationally efficient low-level languages while providing users with a very natural and intuitive programming interface is probably one of the big reasons behind Python's rise to popularity as a lingua franca in the data science and machine learning community.

What tools, frameworks and libraries do you think people should be paying attention to?

SR: There are many interesting libraries being developed for Python. As a data scientist or machine learning practitioner, I'd especially highlight the well-maintained tools from the core Python scientific stack:

- NumPy and SciPy, as efficient libraries for working with data arrays and scientific computing
- Pandas, to read in and manipulate data in a convenient data frame format
- matplotlib for data visualization (and seaborn for additional plotting capabilities and more specialized plots)
- scikit-learn for general machine learning

There are many, many more libraries that I find useful in my projects. For example, Dask is an excellent library for working with data frames that are too large to fit into memory and for parallelizing computations across multiple processors. And TensorFlow, Keras and PyTorch are all excellent libraries for implementing deep learning models.

What does the future look like for Python?

SR: In my opinion, Python's future looks very bright! For example, Python has just been ranked the number one programming language by IEEE Spectrum as of July 2017. While I mainly speak of Python from the data science and machine learning perspective, I have heard from many people in other domains that they appreciate Python as a versatile language with a rich ecosystem of libraries. Of course, Python may not be the best tool for every problem, but it is well regarded as a "productive" language for programmers who want to get things done. And while the availability of many libraries is one of Python's strengths, I must also highlight that most packages are exceptionally well maintained, with new features and improvements to the core data science and machine learning libraries added on a daily basis. For instance, the NumPy project, which has been around since 2006, just received a $645,000 grant to further support its continued development as a core library for scientific computing in Python. At this point, I also want to thank all the developers of Python and its open source libraries who have made Python what it is today.
It's an immensely useful tool to me, and as a Python user, I also hope you will consider getting involved in open source -- every contribution is useful and appreciated, be it small documentation fixes, bug fixes in the code, new features, or entirely new libraries. Again, and with big thanks to the awesome community around it, I think Python's future looks very bright.
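[box type="shadow" align="" class="" width=""]Editor's note: here is a minimal sketch of the core scientific stack Sebastian highlights -- NumPy, pandas, matplotlib, and scikit-learn -- working together. The tiny inline dataset is invented purely for illustration.[/box]

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# pandas: a tiny (made-up) data frame of advertising spend vs. sales
df = pd.DataFrame({'spend': [1.0, 2.0, 3.0, 4.0, 5.0],
                   'sales': [2.1, 4.3, 5.9, 8.2, 9.8]})

# NumPy arrays under the hood: scikit-learn expects a 2D feature array
X = df[['spend']].to_numpy()
y = df['sales'].to_numpy()

# scikit-learn: fit a simple linear regression model
model = LinearRegression().fit(X, y)
print('slope: %.2f, intercept: %.2f' % (model.coef_[0], model.intercept_))

# matplotlib: plot the data points and the fitted line
plt.scatter(df['spend'], df['sales'], label='data')
plt.plot(df['spend'], model.predict(X), label='fit')
plt.xlabel('spend')
plt.ylabel('sales')
plt.legend()
plt.show()
```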
An Interview with Hussein Nasser

Hussein Nasser
01 Jul 2014
4 min read
What initially drew you to write your book for Packt Publishing?

In 2009, I started writing technical articles on my personal blog. I would write about my field, Geographic Information Systems (GIS), or any other technical topic. Whenever a new technology or product emerged, or sometimes even mere tips and tricks, I would write an article about it. My blog became a well-known site in GIS, and that is when Packt approached me with a proposed title. I had always wanted to write a book, but I never expected that the opportunity would knock on my door. I thank Packt for giving me that opportunity.

When you began writing, what were your main aims?

My main aim was to write a book that readers in my domain could grab and benefit from. While working on a chapter, I would always imagine a reader picking up the book and reading that particular chapter, and I would ask myself, what could I do better? Then I would try to make the chapter as simple as possible and leave nothing unexplained.

What did you enjoy most and what was most rewarding about the experience of writing?

Think about all the knowledge, information, ideas, and tips that you possess. You knew you had it in you somewhere, but you didn't know the joy and delight you would feel when this knowledge slipped through your fingertips into a physical medium. With each reading I would reread and polish the chapters; it seems there is always room for improvement in writing.

Why, in your opinion, is ArcGIS exciting to discover, read, and write about?

ArcGIS is not a new technology; it has been around for more than 14 years, and it has become mature and polished during that time. It has expanded and started touching other bleeding-edge technologies like mobile, the web, and the cloud. Every day this technology becomes more worth discovering, and every day it benefits areas like health, utilities, transportation, and so on.

Why do you think interest in GIS is on the rise?

If you read The Tipping Point, by Malcolm Gladwell, you will understand that the smartphone was actually a tipping point for GIS technology. GIS was once used only by enterprises and big companies that wanted to add the location dimension to their tabular data, to help them better visualize and analyze their information. With smartphones and GPS, geographic location became more relevant. Pictures taken with smartphones are tagged with location information. Applications were developed to harness the power of GIS for routing, finding the best restaurants in an area, calculating the shortest routes, sending you text messages when you pass by a shop based on geo-fencing technology, and so on. The popularity of GIS is rising, and so is the interest in adopting this technology.

What do you see on the horizon for GIS?

High-end processing servers are moving to the cloud while we carry smaller and smaller gadgets. Networking is getting stronger every day, with LTE and 4G networks already set up in many countries. Storage has become no issue at all. The web architecture is dominant so far, and it is the most open and compatible platform that has ever existed. As long as we keep using devices, we will need geographic information systems. Data can be consumed and fetched swiftly from anywhere in the world, on the smallest of devices. I believe this will evolve to the point that everything valuable we own can be tagged with a location, so when we misplace or lose something, we can always use GIS to locate it.

Any tips for new authors?
My role model author is Seth Godin; the first book I ever read was his. When I told him about my new book and asked him for any advice he might give me as a new author, he told me, and I quote, "Congratulations, Hussein. This is thrilling to hear; my only advice is to keep writing!" I took his advice, and now I'm working on my second book with Packt.

Another personal tip I can give to new authors is that writing needs focus, and I find music the best way to feed the soul. While working on my first book, I discovered the site www.stereomood.com, which plays music that will help you write. Another thing is to use a clutter-free word processor that blanks the entire screen so you are left with only your words. I use WriteMonkey on Windows and FocusWriter on Mac.
“Git, like all other version control tools, exists to solve for one problem: change” - Joseph Muli and Alex Magana [Interview]

Packt Editorial Staff
09 Oct 2018
5 min read
An unreliable versioning tool makes product development a herculean task. Creating and enforcing checks and controls for the introduction, scrutiny, approval, merging, and reversal of changes in your source code are effective ways to ensure a secure development environment. Git and GitHub offer constructs that enable teams to conduct version control and collaborative development effectively. When properly utilized, Git and GitHub promote agility and collaboration across a team, and in doing so enable teams to focus and deliver on their mandates and goals.

We recently interviewed Joseph Muli and Alex Magana, the authors of the Introduction to Git and GitHub course. They discussed the various benefits of Git and GitHub while sharing some best practices and myths.

Author Bio

Joseph Muli loves programming, writing, teaching, gaming, and travelling. Currently, he works as a software engineer at Andela and Fathom, and specializes in DevOps and Site Reliability. Previously, he worked as a software engineer and technical mentor at Moringa School. You can follow him on LinkedIn and Twitter.

Alex Magana loves programming, music, adventure, writing, reading, and architecture, and is a gastronome at heart. Currently, he works as a software engineer with BBC News and Andela. Previously, he worked as a software engineer with SuperFluid Labs and Insync Solutions. You can follow him on LinkedIn or GitHub.

Key Takeaways

- Securing your source code with version control is effective only when you do it the right way. Understanding the best practices used in version control can make it easier for you to get the most out of Git and GitHub.
- GitHub has an elaborate UI. Learning to navigate it, and installing the Octotree browser extension, will immensely help your development process.
- GitHub is a powerful tool equipped with useful features. Exploring the Feature Branch Workflow and forking features such as submodules and rebasing will enable you to make optimum use of it.
- The more elaborate the tools, the more time they can consume if you don't know your way around them. Master the commands for debugging and maintaining a repository to speed up your software development process.
- Keep your code updated with the latest changes using continuous integration tools such as CircleCI or Travis CI, which integrate with GitHub.
- The struggle isn't over until the code is successfully released to production. With GitHub's release management features, you can learn to complete hiccup-free software releases.

Full Interview

Why is Git important? What problem is it solving?

Git, like all other version control tools, exists to solve for one problem: change. This has been a recurring issue, especially when coordinating work on teams, both local and distributed, and it is specifically an advantage of Git through hubs such as GitHub, BitBucket, and GitLab. The tool was created by Linus Torvalds in 2005 to aid development and contribution on the Linux kernel. However, this doesn't necessarily limit Git to code; any product or project that has multiple contributors or requires release management and versioning stands to gain an improved workflow through Git. This also puts into perspective that there is no single standard: it's advisable to use what best suits your product(s).

What other similar solutions or tools are out there? Why is Git better?

As mentioned earlier, other tools do exist to aid in version control.
There are a lot of factors to consider when choosing a version control system for your organization, depending on product needs and workflows. Some organizations have in-house versioning tools because that suits their development; others, for reasons such as privacy, security, or support, may look for integrations between third-party and in-house tools. Git primarily exists to provide a fast, distributed version control system that is not tied to a central repository, hub, or project. It is highly scalable and portable. Other version control tools include Apache Subversion, Mercurial, and Concurrent Versions System (CVS).

How can Git help developers? Can you list some specific examples (real or imagined) of how it can solve a problem?

A simple way to define Git's indispensability is that it enables fast, persistent, and accessible storage. This means that changes to code throughout a product's life cycle can be viewed and updated on demand, each with simple and compact commands. Developers can track changes from multiple contributors, blame introduced bugs, and revert where necessary. Git enables multiple workflows that align with practices such as Agile (for example, feature branch workflows), as well as forking workflows for distributed contribution, that is, to open source projects.

What are some best tips for using Git and GitHub?

These are some of the best practices you should keep in mind while learning or using Git and GitHub (a short command sketch of the feature branch workflow follows at the end of this interview):

- Document everything
- Utilize the README.md and wikis
- Keep simple and concise naming conventions
- Adopt naming prefixes
- Correspond a PR and branch to a ticket or task
- Organize and track tasks using issues
- Use atomic commits

[box type="shadow" align="" class="" width=""]Editor's note: To explore these tips further, read the authors' post '7 tips for using Git and GitHub the right way'.[/box]

What are the myths surrounding Git and GitHub?

Just as every solution or tool has its own positives and negatives, Git is also surrounded by myths one should be aware of, some of which are:

- Git is GitHub
- Backups are equivalent to version control
- Git is only suitable for teams
- To use Git effectively, you need to learn every command

[box type="shadow" align="" class="" width=""]Editor's note: To explore these myths further, read the authors' post '4 myths about Git and GitHub you should know about'.[/box]
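[box type="shadow" align="" class="" width=""]Editor's note: the feature branch workflow and atomic commits discussed above are easiest to see as commands. Below is a minimal sketch; the ticket number JIRA-42, the branch name, and the remote name origin are invented for illustration.[/box]

```
# Start a feature branch named after its ticket, off an up-to-date master
git checkout master
git pull origin master
git checkout -b feature/JIRA-42-add-login-form

# Stage and commit in small, atomic units; -p lets you pick individual hunks
git add -p
git commit -m "JIRA-42: add login form markup"
git commit -am "JIRA-42: wire login form to auth endpoint"

# Keep the branch current with master, then publish it for a pull request
git fetch origin
git rebase origin/master
git push -u origin feature/JIRA-42-add-login-form
```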