How-To Tutorials - ChatGPT

113 Articles

ChatGPT for Information Retrieval and Competitive Intelligence

Valentina Alto
02 Jun 2023
2 min read
This article is an excerpt from the book Modern Generative AI with ChatGPT and OpenAI Models, by Valentina Alto. The book provides insights into the inner workings of LLMs and guides you through creating your own language models.

Information retrieval and competitive intelligence are fields where ChatGPT is a game-changer. It can retrieve information from its knowledge base and reframe it in an original way. One example is using ChatGPT as a search engine to provide summaries, reviews, and recommendations for books. Alternatively, we could ask for suggestions for a new book we wish to read based on our preferences. If we design the prompt with specific information, ChatGPT can serve as a tool for pointing us towards the right references for research or studies; for example, we can ask ChatGPT to list relevant references for feedforward neural networks.

ChatGPT can also be useful for competitive intelligence: for example, generating a list of existing books with similar content, providing advice on how to be competitive in the market, or suggesting improvements to a book's content to make it stand out.

Overall, ChatGPT can be a valuable assistant for information retrieval and competitive intelligence. However, it's important to remember that its knowledge base cutoff is 2021, so real-time information may not be available.

About the Author

Valentina Alto graduated in 2021 in Data Science. Since 2020 she has been working at Microsoft as an Azure Solution Specialist and, since 2022, she has focused on Data & AI workloads within the manufacturing and pharmaceutical industries. She has worked on customer projects closely with system integrators to deploy cloud architectures with a focus on data lakehouses and data warehouses, data integration and engineering, IoT and real-time analytics, Azure Machine Learning, Azure Cognitive Services (including Azure OpenAI Service), and Power BI for dashboarding. She holds a BSc in Finance and an MSc in Data Science from Bocconi University, Milan, Italy. Since her academic years she has been writing technical articles about statistics, machine learning, deep learning, and AI for various publications, and she has also written a book about the fundamentals of machine learning with Python.

You can connect with Valentina on LinkedIn and Medium.
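For readers who would rather script the kind of book-recommendation query described above than use the chat interface, here is a minimal Python sketch (not taken from the book) that sends one prompt to the OpenAI Chat Completions REST endpoint. The model name, prompt wording, and the OPENAI_API_KEY environment variable are illustrative assumptions.

import os
import requests

# Hypothetical example: ask the model for book recommendations via the REST API.
# Assumes an API key is stored in the OPENAI_API_KEY environment variable.
API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["OPENAI_API_KEY"]

prompt = (
    "Recommend three books about generative AI for a data scientist, "
    "with a one-sentence summary of each."
)

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    },
    timeout=60,
)
response.raise_for_status()

# The assistant's reply sits in the first choice of the response payload.
print(response.json()["choices"][0]["message"]["content"])

As with any answer produced this way, the recommendations should be verified against the knowledge cutoff mentioned above.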


Data Analysis Made Easy with ChatGPT

Sagar Lad
02 Jul 2023
5 min read
Are you weary of trawling through heaps of analysis data in search of meaningful insights? With ChatGPT, the rules will soon change. Thanks to its sophisticated natural language processing skills, ChatGPT can reveal hidden patterns and trends in your data that you never imagined were there. In this article, we'll look at how exploratory data analysis with ChatGPT can revolutionize your data and change the way you conduct business.

Data Analysis with ChatGPT

For data analysts, ChatGPT can be a useful tool for processing, exploring, communicating, and collaborating on their data-driven ideas. Large volumes of data can be analyzed and processed by ChatGPT quickly and effectively. Through its language processing skills, ChatGPT can interpret written inquiries and extract pertinent insights from the data. Here are a few benefits ChatGPT can provide. Data analysts can use ChatGPT to study their data, spot trends, and even produce useful data visualizations; these graphics outline the data clearly, which makes it simpler for analysts to spot trends and insights. Data analysts can also use ChatGPT to explain their findings to non-technical stakeholders: by using natural language, the chatbot can help analysts provide simple explanations of complicated data ideas and insights. Finally, analysts can benefit from ChatGPT's help in coming up with fresh, insightful questions to pose to their data; using natural language queries, they can investigate novel lines of inquiry and unearth previously unconsidered hidden insights.

Let's look at how ChatGPT can make data analysis easy and straightforward. As a data modeler, I want to investigate the data's dictionary and metadata first.

Image 1: Data Dictionary Using ChatGPT, Part 1
Image 2: Data Dictionary Using ChatGPT, Part 2

ChatGPT gives us thorough details about the data dictionary, including a complete description of each column. This guidance on when and how to use the data will benefit the end user. Asking ChatGPT about the dataset's number of rows and columns will help you better grasp the overall statistics.

Image 3: Dataset Statistics

As seen in the image above, ChatGPT gives us a precise estimate of the dataset's number of rows and columns. After getting a broad overview of the dataset, let's examine the data's quality.

Image 4: Exploratory Data Analysis - Null Value Statistics

Here, we've given ChatGPT an input containing the dataset and asked it to determine the percentage of null values, in order to decide whether the data can be used for analytics. The dataset does not contain any null values, hence ChatGPT responds that the given dataset contains no missing values. Now, we can observe that the dataset's header information is absent. Before we can use the data, the columns must contain meaningful names.

Image 5: Dataset Column Naming Convention

Let's ask ChatGPT how it can deliver valuable header data. As you can see, the output of ChatGPT is a set of column headers with descriptions and business-specific naming standards. This makes working with the data easier for both the technical team and business users. We now know that the data quality is good. As outliers will affect the results of the data analysis, let's look for any in the dataset.

Image 6: Detect Outliers in the Dataset

In this case, ChatGPT carries out an in-depth analysis at the column level to see whether any outliers are present. If there are none, that is fine; if there are, it also offers advice on what kind of outlier is present and how it can affect the entire data analysis procedure. Let's now look at how to use ChatGPT to eliminate those outliers.

Image 7: Remove Outliers from the Dataset Using Python, Part 1
Image 8: Remove Outliers from the Dataset Using Python, Part 2

For the given sample dataset, ChatGPT offers thorough Python code that can be used to automatically eliminate the observed outliers. The team may also include business analysts who are unfamiliar with Python. Let's see how ChatGPT can assist business analysts with their data analysis work.

Image 9: SQL Query to Calculate Monthly Revenue, Part 1
Image 10: SQL Query to Calculate Monthly Revenue, Part 2

In this case, ChatGPT offers a default query that the business analyst may use to figure out the monthly revenue for a particular dataset. Next, let's ask ChatGPT to take on the role of a data analyst and offer further insights for a certain dataset.

Image 11: Step-by-Step Data Analysis Using ChatGPT, Part 1
Image 12: Step-by-Step Data Analysis Using ChatGPT, Part 2

As we can see from ChatGPT's results, it offers step-by-step advice on the various analyses and results that can be applied on top of this particular dataset. Each of these tasks can be executed with ChatGPT for each phase of the overall data analysis process. Let's ask ChatGPT to undertake this data analysis work so that it can use Python to analyze prices for the given dataset.

Image 13: Price Analysis Using Python, Part 1
Image 14: Price Analysis Using Python, Part 2
Image 15: Price Analysis Using Python, Part 3

For the purpose of doing price analysis on a given dataset, ChatGPT has produced Python code and sample output. From this output, we can draw a conclusion about how prices are changing over time based on the data points at hand.

Conclusion

In this article, we went into detail on how to use ChatGPT for a variety of exploratory data analysis tasks. We also looked closely at different approaches to carrying out data analysis tasks using Python and SQL. ChatGPT is, in a word, a very useful tool for performing exploratory data analysis tasks while working with massive volumes of data.

Author Bio

Sagar Lad is a Cloud Data Solution Architect with a leading organization and has deep expertise in designing and building enterprise-grade intelligent Azure data and analytics solutions. He is a published author, content writer, Microsoft Certified Trainer, and C# Corner MVP.

Links: Medium, Amazon, LinkedIn
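As a concrete companion to the outlier-removal step shown in Images 7 and 8 above, here is a minimal pandas sketch of the kind of code ChatGPT typically produces for this task (not the code from the screenshots, which are not reproduced here). It filters a hypothetical price column with the interquartile-range rule; the file name and column name are illustrative assumptions.

import pandas as pd

# Hypothetical dataset and column names, used for illustration only.
df = pd.read_csv("listings.csv")

# Compute the interquartile range (IQR) of the price column.
q1 = df["price"].quantile(0.25)
q3 = df["price"].quantile(0.75)
iqr = q3 - q1

# Keep only rows whose price lies within 1.5 * IQR of the quartiles,
# a common rule of thumb for filtering outliers.
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
cleaned = df[df["price"].between(lower, upper)]

print(f"Removed {len(df) - len(cleaned)} outlier rows out of {len(df)}")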


AI_Distilled #16: Baidu's Ernie Chatbot, OpenAI's ChatGPT in Education, Meta's FACET Dataset, FMOps or LLMOps, Qualcomm's AI Focus, InteRecAgent, Liquid Neural Networks

Merlyn Shelley
08 Sep 2023
11 min read
👋 Hello ,“Artificial intelligence is one of the most profound things we're working on as humanity. It is more profound than fire or electricity.” -Sundar Pichai, Google CEO  Pichai's AI-fire analogy signifies a transformative era; AI and ML will revolutionize education, medicine, and more, reshaping human progress. OpenAI has begun promoting the use of ChatGPT in education, which shouldn’t really come as a surprise as students the world over have been experimenting with the technology. Get ready to dive into the latest AI developments in this edition, AI_Distilled #16, including Baidu launching Ernie chatbot following Chinese government approval, X's Privacy Policy Reveals Plan to Use Public Data for AI Training, Meta releasing FACET Dataset to evaluate AI model fairness, Google’s new Multislice for scalable AI training on cloud TPUs, and Qualcomm's focus on AI and auto amidst NVIDIA's chip dominance. Watch out also for our handpicked collection of fresh AI, GPT, and LLM-focused secret knowledge and tutorials from around the web covering Liquid Neural Networks, Serverless Machine Learning with Amazon Redshift ML, implementing effective guardrails for LLMs, Navigating Generative AI with FMOps and LLMOps, and using Microsoft’s new AI compiler quartet. What do you think of this issue and our newsletter? Please consider taking the short survey below to share your thoughts and you will get a free PDF of the “The Applied Artificial Intelligence Workshop” eBook upon completion. Complete the Survey. Get a Packt eBook for Free!Writer’s Credit: Special shout-out to Vidhu Jain for their valuable contribution to this week’s newsletter content!  Cheers,  Merlyn Shelley  Editor-in-Chief, Packt  ⚡ TechWave: AI/GPT News & AnalysisMeta Releases FACET Dataset to Evaluate AI Model Fairness: Meta has launched FACET (FAirness in Computer Vision EvaluaTion), a dataset designed to assess the fairness of AI models used for image and video classification, including identifying people. Comprising 32,000 images with 50,000 labeled individuals, FACET includes demographic and physical attributes, allowing for deep evaluations of biases against various classes. Despite previous concerns about Meta's responsible AI practices, the company claims FACET is more comprehensive than previous bias benchmarks. However, concerns have been raised about the dataset's origins and the compensation of annotators. Meta has also released a web-based dataset explorer tool for FACET. You can read the full paper here. Baidu Launches Ernie Chatbot Following Chinese Government Approval: Chinese tech giant Baidu has unveiled its chatbot, Ernie Bot, after receiving government clearance, along with other AI firms. Ernie Bot is now accessible for download via app stores or Baidu's website. Similar to its rival, ChatGPT, users can engage Ernie Bot for queries, market analysis assistance, marketing slogan ideas, and document summaries. While it's accessible globally, registration requires a Chinese number, and the app is only in Chinese on US Android and iOS stores. Baidu has also introduced a plug-in market for Ernie Bot, which quickly garnered over 1 million users within 19 hours of launch. CEO Robin Li expressed plans for further AI-native apps aimed at exploring generative AI's core abilities. 
Google Introduces Multislice for Scalable AI Training on Cloud TPUs: Google has unveiled Multislice, a comprehensive large-scale training technology that facilitates straightforward, cost-effective, and nearly linear scaling to tens of thousands of Cloud Tensor Processing Units (TPUs) chips. Traditionally, a training run was restricted to a single slice, which meant a maximum of 3072 TPU v4 chips could be used. With Multislice, training can span multiple slices across pods through data center networking, eliminating these limitations. This innovation offers benefits such as efficient scaling for massive models, enhanced developer productivity, automatic compiler optimizations, and cost-efficiency. It promises to revolutionize AI infrastructure by enabling near-linear scaling for AI supercomputing. OpenAI Promotes Use of ChatGPT in Education: OpenAI is encouraging educators to utilize ChatGPT in classrooms. The company showcased six educators, primarily at the university level, using ChatGPT for various purposes, such as role-playing in debates, aiding translation for English-as-a-second-language students, and fact-checking. Despite some schools banning ChatGPT due to concerns about academic integrity, OpenAI believes it can be a valuable tool in education. However, it emphasizes the importance of maintaining human oversight in the assessment process. X's Privacy Policy Reveals Plan to Use Public Data for AI Training: In an update to its privacy policy, X (formerly Twitter) has informed users that it will now collect biometric data, job histories, and education backgrounds. However, another section of the policy reveals a broader plan: X intends to utilize the data it gathers, along with publicly available information, to train its machine learning and AI models. This revelation has attracted attention, particularly due to the connection with X owner Elon Musk's ambitions in the AI market through his company xAI. Musk confirmed the privacy policy change, emphasizing that only public data, not private messages, would be used for AI training.   Qualcomm's Focus on AI and Auto Amidst NVIDIA’s Chip Dominance: As NVIDIA takes the lead as the world's largest fabless chip company, Qualcomm is strategically positioning itself in the AI realm. The company has unveiled in-vehicle generative AI capabilities, expanded into two-wheelers, and forged a partnership with Amazon Web Services. Qualcomm's CEO, Cristiano Amon, believes that generative AI, currently reliant on cloud resources, will transition to local execution, enhancing performance and cost-efficiency. Diversification is also a priority, with Qualcomm's chips powering various smart devices, especially in the automotive sector. Amid uncertainty about its future relationship with Apple, Qualcomm aims to maintain its dominance through innovations in AI and auto tech. InteRecAgent, A Fusion of Language Models and Recommender Systems Introduced: Researchers from the University of Science and Technology of China, in collaboration with Microsoft Research Asia, have introduced InteRecAgent, a cutting-edge framework. This innovation seeks to combine the interactive capabilities of LLMs with the domain-specific precision of traditional recommender systems. Recommender systems play a vital role in various digital domains, but they often struggle with versatile interactions. On the other hand, LLMs excel in conversations but lack domain-specific knowledge. 
InteRecAgent introduces the "Candidate Memory Bus" to streamline recommendations for LLMs and a "Plan-first Execution with Dynamic Demonstrations" strategy for effective tool interaction. adidas Utilizes AI and NVIDIA RTX for Photorealistic 3D Content: Sportswear giant adidas is partnering with Covision Media, an Italian startup, to revolutionize their online shopping experience. Covision employs AI and NVIDIA RTX technology to develop 3D scanners that allow businesses to create digital twins of their products with stunning realism. This technology can quickly generate 3D scans, capturing textures, colors, and geometry, resulting in lifelike images. adidas is among the first to adopt this technology for automating and scaling e-commerce content production, enhancing their Virtual Try-On feature and replacing traditional product photography with computer-generated content.  🔮 Expert Insights from Packt CommunityServerless Machine Learning with Amazon Redshift ML - By Debu Panda, Phil Bates, Bhanu Pittampally, Sumeet JoshiData analysts and developers use Redshift data with machine learning (ML) models for tasks such as predicting customer behavior. Amazon Redshift ML streamlines this process using familiar SQL commands. A conundrum arises when attempting to decipher these data silos – a formidable challenge that hampers the derivation of meaningful insights essential for organizational clarity. Adding to this complexity, security and performance considerations typically prevent business analysts from accessing data within OLTP systems. The hiccup is that intricate analytical queries weigh down OLTP databases, casting a shadow over their core operations. Here, the solution is the data warehouse, which is a central hub of curated data, used by business analysts and data scientists to make informed decisions by employing the business intelligence and machine learning tools at their disposal. These users make use of Structured Query Language (SQL) to derive insights from this data trove. Here’s where Amazon Redshift Serverless comes in. It’s a key option within Amazon Redshift, a well-managed cloud data warehouse offered by Amazon Web Services (AWS). With cloud-based ease, Amazon Redshift Serverless lets you set up your data storage without infrastructure hassles or cost worries. You pay based on what you use for compute and storage. Amazon Redshift Serverless goes beyond convenience, propelling modern data applications that seamlessly connect to the data lake. The above content is extracted from the book Serverless Machine Learning with Amazon Redshift ML written by Debu Panda, Phil Bates, Bhanu Pittampally, Sumeet Joshi and published in Aug 2023. To get a glimpse of the book's contents, make sure to read the free chapter provided here, or if you want to unlock the full Packt digital library free for 7 days, try signing up now! To learn more, click on the button below. Keep Calm, Start Reading! 🌟 Secret Knowledge: AI/LLM ResourcesUnderstanding Liquid Neural Networks: A Primer on AI Advancements: In this post, you'll learn how liquid neural networks are transforming the AI landscape. These networks, inspired by the human brain, offer a unique and creative approach to problem-solving. They excel in complex tasks such as weather prediction, stock market analysis, and speech recognition. Unlike traditional neural networks, liquid neural networks require significantly fewer neurons, making them ideal for resource-constrained environments like autonomous vehicles. 
These networks excel in handling continuous data streams but may not be suitable for static data. They also provide better causality handling and interpretability. Navigating Generative AI with FMOps and LLMOps: A Practical Guide: In this informative post, you'll gain valuable insights into the world of generative AI and its operationalization using FMOps and LLMOps principles. The authors delve into the challenges businesses face when integrating generative AI into their operations. You'll explore the fundamental differences between traditional MLOps and these emerging concepts. The post outlines the roles various teams play in this process, from data engineers to data scientists, ML engineers, and product owners. The guide provides a roadmap for businesses looking to embrace generative AI. AI Compiler Quartet: A Breakdown of Cutting-Edge Technologies: Explore Microsoft’s groundbreaking "heavy-metal quartet" of AI compilers: Rammer, Roller, Welder, and Grinder. These compilers address the evolving challenges posed by AI models and hardware. Rammer focuses on optimizing deep neural network (DNN) computations, improving hardware parallel utilization. Roller tackles the challenge of memory partitioning and optimization, enabling faster compilation with good computation efficiency. Welder optimizes memory access, particularly vital as AI models become more memory-intensive. Grinder addresses complex control flow execution in AI computation. These AI compilers collectively offer innovative solutions for parallelism, compilation efficiency, memory, and control flow, shaping the future of AI model optimization and compilation.  💡 MasterClass: AI/LLM Tutorials Exploring IoT Data Simulation with ChatGPT and MQTTX: In this comprehensive guide, you'll learn how to harness the power of AI, specifically ChatGPT, and the MQTT client tool, MQTTX, to simulate and generate authentic IoT data streams. Discover why simulating IoT data is crucial for system verification, customer experience enhancement, performance assessment, and rapid prototype design. The article dives into the integration of ChatGPT and MQTTX, introducing the "Candidate Memory Bus" to streamline data testing. Follow the step-by-step guide to create simulation scripts with ChatGPT and efficiently simulate data transmission with MQTTX.  Revolutionizing Real-time Inference: SageMaker Unveils Streaming Support for Generative AI: Amazon SageMaker now offers real-time response streaming, transforming generative AI applications. This new feature enables continuous response streaming to clients, reducing time-to-first-byte and enhancing interactive experiences for chatbots, virtual assistants, and music generators. The post guides you through building a streaming web application using SageMaker real-time endpoints for interactive chat use cases. It showcases deployment options with AWS Large Model Inference (LMI) and Hugging Face Text Generation Inference (TGI) containers, providing a seamless, engaging conversation experience for users. Implementing Effective Guardrails for Large Language Models: Guardrails are crucial for maintaining trust in LLM applications as they ensure compliance with defined principles. This guide presents two open-source tools for implementing LLM guardrails: Guardrails AI and NVIDIA NeMo-Guardrails. Guardrails AI offers Python-based validation of LLM responses, using the RAIL specification. It enables developers to define output criteria and corrective actions, with step-by-step instructions for implementation. 
NVIDIA NeMo-Guardrails introduces Colang, a modeling language for flexible conversational workflows. The guide explains its syntax elements and event-driven design. Comparing the two, Guardrails AI suits simple tasks, while NeMo-Guardrails excels in defining advanced conversational guidelines. 🚀 HackHub: Trending AI Toolscabralpinto/modular-diffusion: Python library for crafting and training personalized Diffusion Models with PyTorch.  cofactoryai/textbase: Simplified Python chatbot development using NLP and ML with Textbase's on_message function in main.py. microsoft/BatteryML: Open-source ML tool for battery analysis, aiding researchers in understanding electrochemical processes and predicting battery degradation. facebookresearch/co-tracker: Swift transformer-based video tracker with Optical Flow, pixel-level tracking, grid sampling, and manual point selection. explodinggradients/ragas: Framework evaluates Retrieval Augmented Generation pipelines, enhancing LLM context with external data using research-based tools. 
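Following up on the IoT data simulation tutorial highlighted in the MasterClass section above, here is a minimal Python sketch (an assumption-laden illustration, not taken from that tutorial) that publishes randomly generated sensor readings to an MQTT broker with the paho-mqtt client. The broker address, topic, and payload fields are illustrative.

import json
import random
import time

import paho.mqtt.client as mqtt

# Illustrative broker/topic values; replace with your own environment.
BROKER_HOST = "localhost"
BROKER_PORT = 1883
TOPIC = "factory/line1/temperature"

# paho-mqtt 1.x style constructor; newer 2.x releases also expect a
# callback API version argument.
client = mqtt.Client()
client.connect(BROKER_HOST, BROKER_PORT)
client.loop_start()

# Publish one simulated temperature reading per second.
for i in range(10):
    payload = {
        "sensor_id": f"sensor-{i % 3}",
        "temperature_c": round(random.uniform(18.0, 28.0), 2),
        "timestamp": int(time.time()),
    }
    client.publish(TOPIC, json.dumps(payload))
    time.sleep(1)

client.loop_stop()
client.disconnect()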


AI in the Real World: Insurance

Julian Melanson
21 Jul 2023
5 min read
As the relentless tide of technological advancement swells, the insurance industry, among many others, is facing a pivotal transformation. The inception and evolution of insurance technology, or "insurtech," mandates that insurance agents, brokers, and companies diligently adapt and assimilate novel tools and methodologies to augment their operational efficiency and competitiveness. Among the emerging technologies, the innovative language model ChatGPT, conceived and developed by OpenAI, shows significant potential to redefine the landscape of the insurance industry.

This powerful AI model offers a diverse suite of services, each capable of improving the lives of insurance agents in numerous ways. From perfecting customer service and streamlining underwriting to improving data analytics and fraud detection, ChatGPT opens up a wealth of possibilities. Yet the efficacy and feasibility of these innovative solutions call for a judicious understanding of the technology's strengths and limitations.

The advantages of AI in Insurance

Firstly, customer service, the linchpin of the insurance business, is an area that stands to gain substantially from the implementation of ChatGPT. Insurance products and processes, notoriously labyrinthine to the average consumer, are sources of frequent queries and uncertainties. By employing ChatGPT, insurance firms can automatically answer routine questions related to policy details, billing, claims statuses, and more, in an array of languages. In doing so, it significantly alleviates the burden on customer service agents and concurrently boosts customer engagement.

Such automated systems also find favor among modern consumers, with reports suggesting a notable preference for chatbot interactions. ChatGPT, with its impressive capabilities in generating human-like text responses, can amplify the effectiveness of customer service chatbots. These enhancements invariably lead to increased customer satisfaction, freeing up human agents to tackle more complex customer concerns. Furthermore, ChatGPT's natural language processing prowess can be harnessed to guide customers toward suitable insurance products and services, digitizing sales and distribution.

The underwriting process, a traditionally time-consuming task characterized by risk evaluation, is another sector ripe for the automation that ChatGPT brings. While Artificial Intelligence (AI) and Machine Learning (ML) models have previously been employed to improve the accuracy of risk assessment, gaps in data and information remain problematic. ChatGPT addresses this issue by enhancing data collection and analysis, investigating digital resources for analogous cases, and speeding up the identification of risk patterns.

Through this sophisticated data analysis, ChatGPT can evaluate factors like a customer's age, financial status, occupation, and lifestyle, thereby determining their risk profile. This information enables insurers to offer personalized coverage and pricing, improving customer experience and streamlining underwriting. In addition, it can alert insurers about high-risk individuals or circumstances, proactively averting potential losses. This automatic evaluation raises many questions around AI and ethics (which you can read more about here), but the advantages of getting such a system working are clear.

Claims processing, an insurance operation infamous for its high cost and low level of digitization, is another area primed for disruption by ChatGPT. The AI model can proficiently extract and categorize information from claims forms and other documents, drastically reducing the laborious and time-intensive task of manual data entry.

A significant advantage arising from automating claims processing is its potential in fraud detection. With FBI estimates suggesting that insurance fraud costs American families hundreds of dollars each year in premium hikes, the value of early fraud detection cannot be overstated. ChatGPT can help surface patterns of inconsistencies in claim forms and flag suspicious entries for human review. By alerting insurers to both overt and covert attempts at fraud, ChatGPT can help save billions of dollars annually.

Reasons to be careful

As a caveat, while the utility and advantages of ChatGPT in the insurance industry are substantial, one must consider the nascent stage of this technology. Its real-world impact will be contingent on its successful integration within existing processes and wide-ranging adoption. Moreover, while AI systems offer remarkable capabilities, they are not infallible and require human supervision. Thus, the technology's explainability, its transparency, and its limitations should be carefully considered and understood.

In summary, the potential of ChatGPT to transform the insurance industry is vast, promising efficiency gains, cost reductions, and enhanced customer service. But realizing these advantages requires industry-wide receptiveness, careful integration, and judicious application, along with respect for the limitations of the technology.

Author Bio

Julian Melanson is one of the founders of Leap Year Learning. Leap Year Learning is a cutting-edge online school that specializes in teaching creative disciplines and integrating AI tools. We believe that creativity and AI are the keys to a successful future, and our courses help equip students with the skills they need to succeed in a continuously evolving world. Our seasoned instructors bring real-world experience to the virtual classroom, and our interactive lessons help students reinforce their learning with hands-on activities.

No matter your background, from beginners to experts, hobbyists to professionals, Leap Year Learning is here to bring in the future of creativity, productivity, and learning!


ChatGPT as an Assistant for PLC Programmers

M.T White
13 Jun 2023
10 min read
ChatGPT has been making the rounds recently in the world of programming. In short, there is a lot of hoopla about whether or not it will replace programmers, how well it writes code, how well it troubleshoots issues, and, more importantly, how programmers can leverage it to automate code production.

One area where ChatGPT can be of particular use is in the field of automation. As all automation programmers know, the automation industry is very fast-paced. This means that being able to quickly produce quality code is vital to the success of not only the programmer but the company as a whole. However, it is often the case that some of the simplest code a developer needs to produce is the most time-consuming to write. Two areas that come to mind are large case statements and sorting algorithms. As such, the following article explores how ChatGPT can be used to produce code that would otherwise be monotonous or time-consuming to write. More specifically, it explores how ChatGPT can generate the necessary code for a case statement that displays a motor's health, and how it can be used to select and generate the code for a sorting algorithm to sort part sizes.

A CASE Statement to Detect Motor Health

A very daunting and time-consuming task that all automation programmers perform regularly is creating large case statements. It is not uncommon for automation programmers to have to build things like state machines that have tens or even hundreds of different cases. As anyone who has built these before knows, building structures like these is time-consuming and monotonous at best. For example, suppose a developer is working on a function block that will display a custom message about a motor error in an HMI. Assume in this example the developer needs a code block that can handle the input/output exchange for the following table:

Input value    String Message
1              Cable Error
2              Motor Overheated
3              Under Current

Depending on the experience level of the developer, a simple case statement program that can handle these requirements would take about five minutes. However, when there are more case statements, for example 30 or 40 inputs, the timeframe for completion could drastically increase. Simply put, since there are more cases there will be more code to write. In turn, this means that it will take the developer more time to type out the code, and there will be a higher probability of introducing a bug or bugs due to human error.
On the other hand, to quickly generate a quality code block that meets these requirements, all one would need to do is enter the following into ChatGPT:

write an IEC 61131-3 code block that will set a message variable to "cable error" when the input is 1, "motor overheated" when the input is 2, and "under current" when the input is 3

This statement will output the following:

FUNCTION_BLOCK SetMessage
VAR_INPUT
    Input: INT; // Input value
END_VAR
VAR_OUTPUT
    Message: STRING(50); // Output message
END_VAR

METHOD SetMessage
VAR
    TempMessage: STRING(50); // Temporary message variable
END_VAR

CASE Input OF
    1:
        TempMessage := 'Cable error';
    2:
        TempMessage := 'Motor overheated';
    3:
        TempMessage := 'Under current';
    ELSE
        TempMessage := 'Invalid input'; // Optional: default message for other input values
END_CASE

Message := TempMessage; // Assign the temporary message to the output message
END_METHOD

From the time it took to input and run the query on ChatGPT to the final code output was about one minute. The generated code is quality code that looks like it was produced by an experienced programmer, it is documented, and, above all else, it works. The code even has an optional ELSE branch to catch invalid inputs. Overall, the autogenerated code is no less than what would be expected from a human developer. With that, let's explore how ChatGPT can be used to generate a more complex piece of software.

An Efficient Sorting Algorithm to Sort Parts by Size

It is not uncommon for developers to be tasked with finding the smallest/largest or lightest/heaviest part produced on a production run. For a task like this, it is common to store all the sizes, weights, or whatever it may be in an array and, at one point or another, sort the array to find the greatest and least values. Depending on the size of the array, or in this scenario the number of part sizes to sort, having an efficient sorting algorithm is vital. For this task, an efficient sorting algorithm will need to be picked and the code will need to be implemented.

Assuming the PLC programmer is knowledgeable about sorting algorithms and Big O notation, selecting an algorithm may only take a few minutes. However, for a PLC programmer who is not versed in these topics, such as someone who is perhaps a mechanical engineer or a technician of some type, this task will probably require a Google search, which ideally will take about five minutes. Regardless of whether the programmer is experienced with the algorithm, it will take roughly 10 to 20 minutes to type in the code. As such, on a good day, it would take roughly 10 to 25 minutes for someone to implement a sorting algorithm.
On the other hand, if a person used ChatGPT, all they would have to do is type in the following query:

write an IEC 61131-3 Structured Text sorting algorithm that can sort the elements in an array with 100 entries with the best possible Big O time

Which will generate the following:

FUNCTION_BLOCK QuickSort
VAR_INPUT
    arr: ARRAY[1..100] OF INT;
END_VAR

METHOD SortArray : BOOL
    SortPartition(arr, 1, 100);
    SortArray := TRUE;
END_METHOD

METHOD SortPartition(arr: ARRAY[1..100] OF INT; low: INT; high: INT)
VAR
    pivot: INT;
    partitionIndex: INT;
END_VAR
    IF low < high THEN
        pivot := Partition(arr, low, high, low); // Choosing pivot as the first element
        SortPartition(arr, low, pivot - 1);
        SortPartition(arr, pivot + 1, high);
    END_IF;
END_METHOD

METHOD Partition(arr: ARRAY[1..100] OF INT; low: INT; high: INT; pivotIndex: INT) : INT
VAR
    pivotValue: INT;
    i: INT;
    partitionIndex: INT;
    temp: INT;
END_VAR
    // Swap pivot element to the end
    temp := arr[high];
    arr[high] := arr[pivotIndex];
    arr[pivotIndex] := temp;
    pivotValue := arr[high];
    partitionIndex := low;
    FOR i := low TO high - 1 DO
        IF arr[i] <= pivotValue THEN
            // Swap elements
            temp := arr[i];
            arr[i] := arr[partitionIndex];
            arr[partitionIndex] := temp;
            partitionIndex := partitionIndex + 1;
        END_IF;
    END_FOR;
    // Move pivot element to its correct position
    temp := arr[partitionIndex];
    arr[partitionIndex] := arr[high];
    arr[high] := temp;
    Partition := partitionIndex;
END_METHOD

END_FUNCTION_BLOCK

All of this code was generated in about one minute. As can be seen, ChatGPT generated roughly 60 lines of code, not counting the code that utilizes the example. The only overhead required from the programmer is to create the necessary files and paste the code in. In terms of efficiency, ChatGPT rendered the Quicksort algorithm, which has an average-case complexity of n*log(n). So, in the end, the programmer is probably looking at around five minutes' worth of work to implement an efficient algorithm that would otherwise take up to half an hour by hand.

The Key to Using ChatGPT

As can be seen, the key to using ChatGPT as a programming assistant is simply learning to talk to it. Unlike programming languages, which have a unique syntax, ChatGPT can interpret normal human language. However, to get the most out of ChatGPT, the programmer needs to learn to ask detailed questions. The general rule of thumb is that the more detailed a statement is, the more detailed the solution will be. As seen in the examples, all the code was produced with a single sentence. Depending on the resolution needed, a simple query can produce great results, but it should be noted that if specifics such as particular ports need to be addressed, the user should specify those. Though novel now, the art of talking to these systems to generate optimal code from a single query is likely to become a skill in its own right.

Summary

In all, ChatGPT can be used as a tool to help speed up the development process. As explored with the motor health code block and the sorting algorithm, ChatGPT can turn a simple phrase into workable code that would take a human a considerable amount of time to type out. In short, even if PLC programmers are knowledgeable about programming principles, algorithms, and other computer science concepts, they will always be bottlenecked by having to implement the code if they cannot simply cut and paste it from another source.
When used in the way explored here, ChatGPT is a great productivity tool. It can greatly reduce the amount of time needed to implement, and if necessary find, a solution. Overall, ChatGPT needs guidance to arrive at a proper solution, and the person driving the AI needs to be competent enough to implement that solution. However, in the right hands, ChatGPT and similar AI systems can greatly improve development time.

Author Bio

M.T. White has been programming since the age of 12. His fascination with robotics flourished when he was a child programming microcontrollers such as Arduino. M.T. holds an undergraduate degree in mathematics and a master's degree in software engineering, and is currently working on an MBA in IT project management. He works as a software developer for a major US defense contractor and is an adjunct CIS instructor at ECPI University. His background mostly stems from the automation industry, where he programmed PLCs and HMIs for many different types of applications. M.T. has programmed many different brands of PLCs over the years and has developed HMIs using many different tools.

Author of the book: Mastering PLC Programming


AI in the Real World: Real Estate

Julian Melanson
20 Jul 2023
4 min read
The fast-paced development of Artificial Intelligence has already started reshaping various sectors, with the real estate industry standing out as a prominent beneficiary. Of particular interest is the potential AI presents in streamlining property valuation, a critical process that underlies a myriad of real estate activities, including setting sale prices, making investment decisions, and optimizing home insurance premiums. While the conventional means of property valuation have their merits, they are far from perfect. This article delves into the potential of AI, specifically OpenAI's ChatGPT, in transforming property valuation in the real estate sector, discussing the challenges inherent to traditional approaches and exploring the benefits offered by this AI-driven approach.

The Current State of Property Valuation

Property valuation is a meticulous process that draws on a variety of data sources, both public and private. Depending on the valuation's purpose, the time and effort committed to research can differ significantly. For instance, real estate brokers might base their Broker Price Opinions on a limited set of comparable properties, while appraisers might undertake a thorough firsthand inspection to understand a property's condition, quality, and value comprehensively.

Despite the evolution of valuation methodologies over the years, traditional approaches still grapple with certain obstacles. One of the primary issues is data inconsistency, mainly arising from the dispersed and scattered nature of relevant property data across various sources. While attempts have been made to centralize information on property features, ownership changes, and other key insights, consistency in data remains elusive. The result is disparities in the Automated Valuation Models (AVMs) currently used, which can lead to divergent valuations for the same property.

Moreover, human bias forms a significant challenge in property appraisals. It is often difficult to find identical properties for comparison, leading to inevitable subjectivity in the adjustments made to reconcile price differences. Studies show that appraised values fall below the agreed purchase price in just 10% of cases, suggesting a propensity towards price confirmation bias, a situation that calls for greater objectivity in home appraisals.

Integrating AI into Property Valuation: The Role of ChatGPT

In response to these challenges, the integration of AI into the property valuation process presents a promising solution. The application of AI, especially advanced language models like ChatGPT, can offer consistent examinations of a property's condition and quality, mitigating issues associated with data inconsistencies and human bias.

ChatGPT, a generative pre-trained transformer, has been designed to understand and generate human-like text based on given input. In the context of real estate, it offers tremendous potential in data analysis and, consequently, in generating accurate property valuations. Traditionally, property valuations have been conducted by human appraisers who assess a property's worth based on a range of factors such as location, size, and condition. However, this approach can be time-consuming, costly, and susceptible to human error.

By incorporating ChatGPT into the valuation process, real estate professionals can input relevant data into the AI model, which can then analyze the data and supply a detailed valuation report. The implications of this are transformative for the industry: it offers considerable time savings, reduces the potential for errors, and enhances the transparency of the valuation process.

A Practical Application of ChatGPT in Property Valuation

Here's a very simple prompt that illustrates how ChatGPT can be a great guide in the property valuation process (the prompt screenshot is not reproduced here).

The evolution of AI has unlocked numerous opportunities for innovation and efficiency across a variety of sectors, with the real estate industry being no exception. In particular, the advent of AI models like ChatGPT has opened new avenues for enhancing the accuracy and efficiency of property valuations. By surmounting the obstacles inherent to traditional valuation methodologies, such as data inconsistencies and human bias, AI offers a more streamlined, transparent, and precise approach to property valuation.

Author Bio

Julian Melanson is one of the founders of Leap Year Learning. Leap Year Learning is a cutting-edge online school that specializes in teaching creative disciplines and integrating AI tools. We believe that creativity and AI are the keys to a successful future, and our courses help equip students with the skills they need to succeed in a continuously evolving world. Our seasoned instructors bring real-world experience to the virtual classroom, and our interactive lessons help students reinforce their learning with hands-on activities.

No matter your background, from beginners to experts, hobbyists to professionals, Leap Year Learning is here to bring in the future of creativity, productivity, and learning!

Exploring OpenAI ChatGPT as a Researcher, a Writer and a Software User

Urszula Witherell
22 Jun 2023
6 min read
Artificial Intelligence is now a hot topic, but the concept is not that new: in translation from one language to another, it has been used since the 1960s to prepare rough drafts, with the idea that the text would then be refined by human editors. We interviewed Urszula Witherell, author of the book Adobe Acrobat Ninja, for her perspectives on using AI, specifically the freely available ChatGPT, as a reader, a writer, and a software user.

What is your experience using ChatGPT as a Reader/Researcher?

I am a curious person, and my mind constantly questions where and why. I love exploring new ideas, learning about people living in different cultures, and exploring different perspectives on topics that matter to me. AI as a research tool is wonderful, although very quickly I learned about its limitations. A simple test was to search for my own name. It knew nothing. My ego was not bruised, as I am not a celebrity or want to be one. On the other hand, a simple search on Google very quickly found my name, my address, my family members, and everything I did online going back at least a decade. ChatGPT insisted that it knew nothing about me since its training ended in September 2021, two years ago. Similarly, when asked about a company website that has been active for at least 20 years, it gave this reply: "I apologize for the confusion, but as an AI language model, I don't have the capability to browse the internet or visit specific websites. My responses are based on pre-existing knowledge up until September 2021. Therefore, I'm unable to provide a real-time description or analysis of the website," or "Without specific information, I can't provide details about the website's content, purpose, or any other features it may have. It's always a good idea to visit the website directly to gather accurate and up-to-date information." This made me ask: how is ChatGPT trained? Where does the data come from?

Once, I manually copied information from the website and asked ChatGPT to generate a new response. It expanded on dry data and formulated a nice description of the website content.

Image 1: ChatGPT cannot figure out the content of the website
Image 2: ChatGPT expanded the description

When I asked for descriptions of cities, landmarks, or public figures, ChatGPT did provide sufficient information in full sentences with rich vocabulary. When asked to regenerate responses, it provided fresh and even better-formulated descriptions. Again, not knowing where the data for AI comes from, I depended on what I already knew about the locations to evaluate the accuracy of the response. It made me think about the famous quote: "Trust but verify." So, compare results from multiple other sources to get results that you can trust. Asked about liking a city, this is what I got: "As an AI language model, I don't have personal opinions or emotions, so I don't have the ability to like or dislike anything. However, I can provide information." Not quite a chatty, opinionated friend, at least not yet.

What is your experience using ChatGPT as a Writer?

I think this is where the AI can be very helpful, as it can be a thesaurus, spell-checker, and grammar checker all at once. It can rewrite sentences; from just a rough draft it can expand descriptions and literally be your personal assistant during the writing process. You can see this in the screenshot example. As a bilingual person, I often wonder how to express ideas in Polish for my friends in Poland. I discovered that translations of English text to Polish are very good. Grammar is correct, vocabulary is rich enough, and sentences are full and mostly sound natural; much better than what I had experienced in the past using Google Translate, when translated text needed a real human translator to be understood by a Polish-only speaking person (my parents). Of course, there are some flaws, such as the assumption of the male gender as an author. In Polish, all nouns are assigned one of three genders: feminine (yes, ladies first), masculine, and neuter. This is where corrections were needed. Additionally, since the example to be translated was a chapter of a software guide, it used many English terms in Polish grammatical form. My English teacher would faint at such a transgression, as we were always taught to use native expressions in place of foreign adoptions. But since technology vocabulary originated in English, the rest of the world must just deal with it. I can live with that. I will depend on ChatGPT as a writer from now on. Definitely thumbs up!

What about the perspective of a software user?

Well, this is where the results vary, and it simply depends on what you are looking for. I tested a question about authoring Adobe InDesign documents for PDF accessibility and compliance with Section 508 requirements in the USA. ChatGPT provided a very detailed, step-by-step description of how to proceed, but in responses regenerated multiple times it missed the most advanced method, where automated alt-text tags can be generated from image metadata using Object Styles. So again, AI helped, but I needed to know already what I could possibly do. Great help from a writer's perspective, but not so good for someone who is exploring more advanced software functions.

It is exciting to explore new platforms and technology. Like any new frontier, it brings excitement but also fear. And again, I am reminded that we as humans have not changed at all. Our toys changed, and we became more sophisticated and powerful, but we continue to need guidance on how to use yet another newly found power and knowledge. History is not a strong enough deterrent from misuse.

Summary

In conclusion, the interview with Urszula sheds light on the versatile applications of ChatGPT in the roles of a researcher, a writer, and a software user. Urszula highlights the efficiency and creativity enabled by ChatGPT, allowing her to tackle complex research problems, generate high-quality content, and even assist with software-related tasks. Through her experiences, we gain insights into the immense potential of ChatGPT as a powerful tool for augmenting productivity and innovation across various domains. As the technology continues to evolve, researchers and professionals can leverage ChatGPT to unlock new possibilities and achieve greater efficiency in their work.

Author Bio

Urszula is an expert graphic designer, software instructor, and consultant. As an instructor, she taught thousands of Adobe software users in government agencies and private corporations. Her easy-going and effective presentation style of complex features earned her many loyal clients. Her consulting work related to PDF included creating templates, editing files, and providing recommendations on the document production process. The final goal was to ensure that each publication met the highest publishing and accessibility standards, which was achieved successfully. Urszula also designed training curricula and reference materials on proper document construction prior to conversion to PDF in relevant authoring software, such as MS Word, Adobe InDesign, and FrameMaker.

Author of the book: Adobe Acrobat Ninja


Infusing Old-school PowerPoint Presentations with GPT Powers

Andrei Gheorghiu
08 Jun 2023
9 min read
IntroductionRumor has it that around 30 million PowerPoint presentations are being created each day. Oh, PowerPoint, the sturdy old friend in the world of presentations. You might ask, why, in this fast-paced, ever-evolving digital world, do people still reach out for PowerPoint? Well, let me break it down for you, because honestly, PowerPoint has still got game! It’s simple, universal (in the realm of presentations), easy to learn and pretty well integrated with the other two Microsoft musketeers (Word & Excel).So, sure, there are fancier new alternatives out there (like Prezi or Canva), each with their unique features and slick interfaces. But when it comes to a tried-and-true solution for your presentation needs, you might find that, like a well-loved pair of sneakers, PowerPoint fits just right.And here’s the news: the little AI based automation I am going to show you in this article might actually change your mind about the capabilities of PowerPoint and make you reconsider using much more modern alternatives for your presentations.Working as a trainer for the last 15 years, I’ve probably seen more PowerPoint than other people do. I used it daily both for studying and teaching, whether for 15-minute presentations or 5-day long courses. I’ve always thought the greatest power of PowerPoint was not in its UI, design features or the number of export formats but rather in the VBA (Visual Basic for Applications) core functionality it shares with the other tools in the Microsoft ecosystem. And yet, so much potential lays unused in this regard.Oh, did you think that VBA is dead? I’m going to prove you wrong.A key characteristic for any slide-based presentation is that the viewer only gets a glimpse of the actual knowledge in the slide. It’s more visually appealing that way, but you need a presenter to understand the story and connect the dots. And usually, the presenter needs to use some slide notes for a more cursive and natural presentation. Slide notes are like a secret teleprompter. They give you reminders, help you with the flow of your presentation, and keep you on point. They're your discreet cheat-sheet, making sure you never miss a beat during your presentation.In this article, I am going to show you how, with just a few lines of code, you can infuse your existing PowerPoint presentations with the ability to automatically generate slide notes based on your slide contents.How it worksMy approach might seem a bit crude but it is effective. Especially if your presentation contains general concepts that might fall within the knowledge span of GPT3.5 Turbo from OpenAI (or even GPT-4 if you have an API key). More niche topics, or a slide having just a single bullet of words without too much context, may not produce the best results. Afterall, it’s all automated and no matter how smart the AI model may be, it still needs context to understand the task.For those who might be unfamiliar with the GPT3.5 Turbo API: this API provides programmatic access to the same advanced language model used by ChatGPT. 
Its purpose in the script is to explain the concepts in a PowerPoint slide.

So, the basic approach is to create a VBA macro in the presentation that performs the following:

1. Extracts all text from the current slide.
2. Formats the entire text and builds a specially designed prompt with it.
3. Submits the prompt to OpenAI's API for processing.
4. Appends the AI answer received from the API to the current slide's notes.

Because it's a macro, once you've created it you can assign a shortcut and repeat the process for each slide. Alternatively, with a bit of tweaking, you could adjust the code to iterate through all the slides and create all slide notes in one go (a rough sketch of such a batch variant appears at the end of this article). But keep in mind that there is a cost for every API call you make, and for a large presentation the cost might easily spin out of control.

Before
After

Imagine the amount of time you can save by starting with some quick AI-generated draft notes and then reviewing and adding more context, instead of having to type everything in from scratch. This productivity hack can save you hours, or even help you make sense of presentations created by someone else that lack any explanations or trainer notes.

Before you begin

Some things you need to take care of before you start writing the code:

1. Make sure you store your OpenAI API key in a local environment variable for secure and efficient access. The code works on the assumption that the API key is stored in your local environment (OPENAI_API_KEY).
2. Enable the Developer tab in PowerPoint. By default, this tab is not visible. To enable it and be able to create VBA code, go to File > Options > Customize Ribbon and select Main Tabs > Developer > Add. I know. That's enough reason to hate PowerPoint. But at least, the good news is you only have to do it once.

Once you've got that solved, you may proceed with the next steps:

3. Go to the Developer tab and select Visual Basic.
4. From the Insert menu, choose Module.
5. Add the code from the next section in the newly created module and save the file. You'll probably be notified that the .PPTX extension does not support macros. No worries. Just click No, select .PPTM (Macro enabled presentation) as your file type and save. Make sure you customize the code first to provide the prompt with more context on your specific presentation (see the comments in the code).

And that's it. You're good to go. Now, select Developer > Macros and you should see a new macro called QueryGpt. Just go to your desired slide and run the macro. Wait for a few seconds and you should have your notes populated with the AI's interpretation of your slide. Be mindful though, the code is simple. There's no error checking and I haven't tested it on hundreds of PPTs. You can download the .BAS source file for this article from here.

Pro Tip: If you want to create an easy shortcut to your macro, click on the last icon in the very top ribbon menu and select More commands, then under Choose commands from select Macros, choose the QueryGpt macro, and click Add. A new button for the macro will be permanently added to your ribbon.

The code

No more talking.
Here's the macro:

Sub QueryGpt()
    Dim sld As Slide
    Dim shp As Shape
    Dim text As String
    Dim explanation As String
    Dim prompt As String
    Dim subject As String

    ' Extract and format the text from the current slide
    Set sld = ActiveWindow.View.Slide
    For Each shp In sld.Shapes
        If shp.HasTextFrame Then
            If shp.TextFrame.HasText Then
                text = text & " [" & Replace(shp.TextFrame.TextRange.text, vbCrLf, "/") & "] "
            End If
        End If
    Next shp
    text = Replace(Replace(Replace(text, vbCrLf, "\n"), vbCr, "\r"), vbLf, "\n")

    ' Prepare the prompt and call the function that queries the AI model
    subject = "The concept of Public Domain"
    prompt = "You are a useful assistant that explains the concepts from a presentation about "
    prompt = prompt & subject & ". Explain the following concepts: "
    explanation = CallOpenAI(prompt & text)

    ' Append the AI explanation to the slide notes
    sld.NotesPage.Shapes.Placeholders(2).TextFrame.TextRange.InsertAfter vbCrLf & explanation
End Sub

Take note of the variable called subject. That is the main topic of your presentation, and you should change it to the subject of your slides. That way the AI model gets more context about your slides and can provide better descriptions in the notes. You should be able to achieve better results by adding more details to that variable, but consider the maximum context of the model (for GPT-3.5 Turbo, that is around 3,000 English words). The more you add to your initial input, the more you limit your result's maximum length, because the whole context cannot exceed ~3,000 words.

Next, here is the function responsible for making the API call and formatting the resulting answer:

Function CallOpenAI(text As String) As String
    Dim httpRequest As Object
    Dim responseText As String
    Dim contentStartPos As Long
    Dim contentEndPos As Long

    ' Prepare the connection and send the request to the OpenAI API
    Set httpRequest = CreateObject("WinHttp.WinHttpRequest.5.1")
    httpRequest.Open "POST", "https://api.openai.com/v1/chat/completions", False
    httpRequest.setTimeouts 0, 60000, 30000, 120000
    httpRequest.setRequestHeader "Content-Type", "application/json"
    httpRequest.setRequestHeader "Authorization", "Bearer " & Environ("OPENAI_API_KEY")
    httpRequest.Send "{""model"": ""gpt-3.5-turbo"", " _
        & """messages"": [{""role"": ""user"", " _
        & """content"": """ & text & """}]}"
    httpRequest.WaitForResponse
    responseText = httpRequest.responseText

    ' Extract the AI answer from the response string
    contentStartPos = InStr(1, responseText, """content"":""") + 11
    responseText = Mid(responseText, contentStartPos)
    contentEndPos = InStr(1, responseText, """") - 1
    responseText = Replace(Mid(responseText, 1, contentEndPos), "\n", vbCrLf)
    CallOpenAI = responseText
End Function

All you have to do to add this code to the presentation is to follow steps 3 to 5 above. Once you've created the module and saved the presentation, you'll have the new macro called QueryGpt. If you've created the shortcut by following the pro tip, you'll also have a neat way of calling the macro whenever you need to append some slide notes to the current slide.

And there you go: old-school PowerPoint presentations infused with AI super-powers. Feels like converting an old Diesel to an EV, doesn't it?

Disclaimer

As with any AI-generated content, do not forget to check the outputs for accuracy and potential hallucinations (https://www.packtpub.com/page/4-ways-to-treat-a-hallucinating-AI-with-Prompt-Engineering_). Do not forget that any bias or inaccurate data in the slide content may get amplified in the outputs.
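Bonus: generating notes for all slides

If you would rather generate notes for the entire deck in one pass, the same building blocks can be rearranged into a batch macro. The sketch below is only a rough, untested variant under a few assumptions: it reuses the CallOpenAI function shown above (it must live in the same module), and the macro name GenerateAllNotes and the hard-coded subject are illustrative placeholders rather than part of the original code. Remember that every slide triggers a separate API call, so the cost adds up quickly on large presentations.

Sub GenerateAllNotes()
    Dim sld As Slide
    Dim shp As Shape
    Dim text As String
    Dim prompt As String
    Dim explanation As String
    Dim subject As String

    ' Change this to the main topic of your presentation
    subject = "The concept of Public Domain"

    For Each sld In ActivePresentation.Slides
        text = ""
        ' Collect the text from every shape on the slide
        For Each shp In sld.Shapes
            If shp.HasTextFrame Then
                If shp.TextFrame.HasText Then
                    text = text & " [" & Replace(shp.TextFrame.TextRange.text, vbCrLf, "/") & "] "
                End If
            End If
        Next shp
        text = Replace(Replace(Replace(text, vbCrLf, "\n"), vbCr, "\r"), vbLf, "\n")

        ' Skip slides that contain no text at all
        If Len(Trim(text)) > 0 Then
            prompt = "You are a useful assistant that explains the concepts from a presentation about " _
                & subject & ". Explain the following concepts: " & text
            ' Each call is billed separately, so mind the cost on large decks
            explanation = CallOpenAI(prompt)
            sld.NotesPage.Shapes.Placeholders(2).TextFrame.TextRange.InsertAfter vbCrLf & explanation
        End If
    Next sld
End Sub

As with QueryGpt, you can assign this macro a shortcut or ribbon button, and you should still review the generated notes slide by slide before presenting.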
Author Bio

Andrei Gheorghiu is an experienced trainer with a passion for helping learners achieve their maximum potential. He always strives to bring a high level of expertise and empathy to his teaching.

With a background in IT audit, information security, and IT service management, Andrei has delivered training to over 10,000 students across different industries and countries. He is also a Certified Information Systems Security Professional and Certified Information Systems Auditor, with a keen interest in digital domains like Security Management and Artificial Intelligence.

In his free time, Andrei enjoys trail running, photography, video editing and exploring the latest developments in technology.

You can connect with Andrei on:
LinkedIn: https://www.linkedin.com/in/gheorghiu/
Twitter: https://twitter.com/aqg8017