The spooky season is here: candy, costumes, and fear everywhere.
But the real nightmare? It's not ghosts, it's job loss.
Over 71% of people believe AI will take their jobs by 2025. The anxiety is real: Are you good enough? Fast enough? Smart enough?
But here's your treat this Halloween: a real solution to end the fear.
Join the online 2-Day LIVE AI MASTERMIND by Outskill, a hands-on intensive training designed to make you an AI-powered professional who can learn, earn, and build with AI.
Usually $395, but as part of their Halloween sale 🎃 you can get in completely FREE!
Rated 9.8/10 on Trustpilot, it's an opportunity to become an AI Generalist who can build, solve, and work on anything with AI instead of fearing it.
In just 16 hours & 5 sessions, you will:
✅ Build AI agents that save 20+ hours weekly and turn time into money
✅ Master 10+ AI tools that professionals charge $150/hour to implement
✅ Launch your $10K+ AI consulting business in 90 days or less
✅ Automate 80% of your workload and scale your income without working more hours
Learn strategies used by giants like Google, Amazon, and Microsoft, taught by their practitioners 🚀🔥
🧠 Live sessions: Saturday and Sunday
🕜 10 AM EST to 7 PM EST
🎁 You will also unlock $5,000+ in AI bonuses: prompt bibles 📚, a roadmap to monetize AI 💰, and your personalised AI toolkit builder ⚙️, all free when you attend!
Sponsored
Welcome to BIPro Expert Insights #116!
Behind the Book: Eric Narro and Getting Started with Taipy
In this week’s Expert Edition, we’re excited to feature Eric Narro, Analytics Engineer and author of Getting Started with Taipy. Eric brings a refreshing perspective on how to move from time series to chatbots—and how to bring your Python models to life with Taipy, a tool purpose-built for taking data apps from prototype to production.
For those unfamiliar, Taipy is a pure-Python application builder designed to help you deploy scalable, interactive data applications in real production environments. It’s ideal for turning your analytics, models, and algorithms into end-user experiences, whether dashboards, optimization tools, or AI-powered chatbots.
This week, Eric not only shares his technical insights in the article From Time Series to Chatbots: Bring Your Python Models to Life with Taipy, but also takes us behind the scenes of his author journey in How I Got to Write a Book with Packt, a personal story about curiosity, persistence, and how a chance encounter at PyCon France sparked a book deal.
And here's a bonus: for one week only, you can grab Getting Started with Taipy at 30% off the ebook and 10% off the print edition. It's the perfect time to dive in and start building production-ready Python applications.
Let’s explore both the tech and the story that made it possible.
Cheers,
Merlyn Shelley
Growth Lead, Packt
From Time Series to Chatbots: Bring your Python Models to Life with Taipy
Taipy is a Python application builder with one clear promise: deploy your data applications in real production environments. It's the ideal tool for creating scalable, interactive apps that bring your models, analytics, and algorithms to life. Whether you're building dashboards, optimization tools, or AI-powered chatbots, Taipy helps data professionals turn prototypes into powerful, end-user applications. With Getting Started with Taipy, you'll learn how to build complete applications from the ground up, deploy them confidently, and explore real-world examples and advanced use cases that showcase Taipy's full potential.
Python has long been the go-to language for data professionals, not because they're developers, but because Python makes complex work accessible. Analysts, data scientists, and AI engineers use it to model data, run analytics, and visualize results.
But when it comes to turning those models into real applications for end users, things get tricky. Building a web app the traditional way, with backend frameworks, databases, and front-end stacks, is often out of reach for data teams. It demands skills, time, and coordination that slow everything down and increase costs.
Tools like Power BI or Tableau help visualize data, but they can't truly run Python code or offer the flexibility of a full application. Python frameworks like Streamlit, Dash, Panel, or Gradio solve the problem partially, and each has trade-offs. For example, Streamlit is a great library for prototyping: it's very easy to learn, and you can create demos in no time. While you can take Streamlit applications to production, they are harder to scale because they don't optimize the way code runs, and they run on their own server (you can't run them in a WSGI server). In practice, that means they work well for end users only when usage is light or you don't need to process large amounts of data.
That’s where Taipy comes in!
Taipy lets you create scalable, production-grade applications directly in Python. Whether for time series, optimization, geospatial analysis, or even LLM chatbots, Taipy is designed for performance and scalability. You can deploy Taipy apps on WSGI servers, handle multiple users efficiently, and still build everything using pure Python.
Continue reading the full article on our Packt Medium Handle here.
How I Got to Write a Book with Packt
Had you told me four years ago, "Eric, you're going to write a book about computer science or data topics, and you'll actually be a published author," I'd have laughed and said, "Come on, stop lying to me!"
But here we are.
In this article, I want to share how I ended up writing a book: the story behind the opportunity, the twists and turns that led me there, and the lessons I picked up along the way. In a follow-up post, I'll dive deeper into what it was really like to go through the writing and publishing process.
My motivation for this article is to encourage anyone reading it to take action and work hard toward their goals, whatever they are. A second motivation is to show how working toward your goals can bring unexpected results (I had never thought of writing a book before I was asked to do it!).
I'm 38 as I write these lines. I guess any story behind any personal outcome could be traced back to birth, but don't worry, I won't torture you with a detailed overview of my past! Still, there's a whole waterfall of events that led me here, and that's what I want to write about.
How I Became a Data Analyst
For a living, I work as a data analyst. Although, to be honest, these days I actually do more of a data engineering role (ETL tasks, integrating data into databases, and so on; it's quite diversified).
I've written before about the path I took to becoming a data analyst, but to summarize: I was a vineyard technician for 8 years; I learned Python to build my own tools; I eventually studied programming more extensively through distance-learning college courses and a number of Coursera courses; and I finally changed careers after some time spent programming both at work and on personal projects.
It took me a long time to take the step of changing careers, in part because I had started learning programming (and programming effectively) as a way to improve and automate tasks at my former job, which I didn't dislike, and in part because, at some point, I lacked the confidence to make the switch. But over time, I realized that I loved programming and working with data even more than being a vineyard technician, and that gave me the push I needed.
Before officially changing careers, I had already learned version control and built a small GitHub portfolio (with README files and documentation). I had also gained a foundation in SQL, dabbled in web development with PHP and MySQL, and knew how to deal with Linux systems. During college, I had programmed in C, Bash, LISP, JavaScript, and Prolog, and I knew my way around Boolean logic, encoding, and binary calculations. I also knew a bit about statistics, analytical workflows, and data warehousing.
The reason I mention all this is that it was quite odd that I became a data analyst in the first place; I found that passion along the way. But at the same time, it wasn't a miracle either; it was the result of hard work, and ultimately of other people giving me a chance based on what I was able to share with them. That is why it's important to just do things. Efforts will end up paying off in one way or another. By just doing things, you'll eventually end up doing too many things, and then you'll have to choose which one… but until then, just start doing something you like!
And well, I also mention this because there’s no way I would have found out about Taipy, which is a Python library, had I not been proficient with Python!
Continue reading the full article on our Packt Medium Handle here.
See you next time!