Earlier this month we sent a group of our talented Hux programmers, software developers, and data engineers to PyGotham in New York City to connect with the larger community and talk through the common issues and roadblocks developers face every day.

Hux by Deloitte Digital owes a lot of its software success to the wonderful humans in the Python community. After all, our team lives and breathes Python development. After two full days at PyGotham, Julian Berman, VP of decisioning for Hux by Deloitte Digital, and Hichame El Khalfi, data engineering lead for Hux by Deloitte Digital, are reporting back on some of their biggest takeaways.

Hype vs. reality: Choosing the right solutions for your problems

One resounding conclusion from PyGotham this year is that the right tool for whatever engineering or programming challenge you face is often not the one that is most hyped. Many of the talks and hallway conversations at PyGotham focused on what tools people actually use to solve the problems they’re confronted with as programmers.

This “say no to hype” approach goes for challenges we’re working through as professional programmers and software engineers, as well as for solving problems outside of the office.

From a professional standpoint, Mahmoud Hashemi, principal engineer at SimpleLegal, shared real-world advice on building effective software that reaches the masses, along with practical use cases from real Python applications. But part of what we love about PyGotham is hearing about the creative solutions people have dreamed up to solve their own real-world, human problems with the magic of programming and automation. The fantastic keynote by Kojo Idrissa of Django Events Foundation North America dove into the rapid growth of Python for solving personal problems, such as automating your personal finances, how that shift has fueled our community’s growth, and how the community can make the most of new contributors.

We listened to another great talk by Jessica Garson, a developer at Twitter, on how she used Python to solve a notorious problem faced by all New Yorkers: parking in the city. Jessica explained how she used data to take this issue into her own hands—she’s now mastered her daily street parking routine and avoids getting parking tickets.

These discussions reminded us to always take a real-world approach to the issues we face day to day at Hux. While advancements in AI have put deep learning and other “trendy” techniques at our fingertips, we can often solve the same challenges faster and with fewer resources using tried-and-true tools that aren’t as hyped or marketed. Remember: just because it’s a hot topic doesn’t mean it’s the best tool in your arsenal for that specific problem.

Reduce your runtime, keep your operational team happy

Hux’s talk focused on real-world recommendations for reducing AI pipeline runtime, including platform architecture, continuous integration, which Spark configuration to use, and how to deploy these tools into production. We chose to tackle a frequent issue and offer actionable tips for optimizing models on behalf of clients using PyPy, while also touching on the limitations of running PyPy instead of CPython.
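
To give a flavor of the trade-off, here is a minimal, hypothetical micro-benchmark (not code from our talk): a pure-Python hot loop of the sort PyPy’s JIT tends to accelerate, runnable unchanged under both CPython and PyPy. Code that leans heavily on C extensions, which is common in model pipelines, may see much smaller gains under PyPy, or may not run there at all.

    # benchmark_loop.py: a hypothetical micro-benchmark, runnable under both
    # CPython and PyPy (e.g. `python benchmark_loop.py` vs. `pypy3 benchmark_loop.py`).
    import time

    def score_rows(n):
        # A pure-Python hot loop: the kind of code PyPy's JIT typically speeds up.
        total = 0.0
        for i in range(n):
            total += (i % 7) * 0.5 + (i % 3) * 1.5
        return total

    if __name__ == "__main__":
        start = time.perf_counter()
        result = score_rows(10_000_000)
        elapsed = time.perf_counter() - start
        print(f"result={result:.1f} elapsed={elapsed:.2f}s")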

The audience had some excellent technical follow-up questions after the presentation, including questions about Hichame El Khalfi and Deepshikha Gandhi’s experience with Terraform and infrastructure as code, and about how they automated their Hadoop cluster deployments using Ansible modules written in Python.
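
For readers who haven’t written one, here is a minimal sketch of a custom Ansible module in Python. It is purely illustrative rather than our production code, and the module name, arguments, and behavior are hypothetical; it simply shows the general shape of the AnsibleModule API that this kind of automation builds on.

    #!/usr/bin/python
    # hadoop_service.py: an illustrative custom Ansible module (hypothetical, not production code).
    from ansible.module_utils.basic import AnsibleModule

    def main():
        # Declare the arguments this module accepts from a playbook task.
        module = AnsibleModule(
            argument_spec=dict(
                service=dict(type="str", required=True),
                state=dict(type="str", choices=["started", "stopped"], default="started"),
            ),
            supports_check_mode=True,
        )

        service = module.params["service"]
        state = module.params["state"]

        # A real module would start or stop the Hadoop service here,
        # for example by shelling out or calling a management API.
        changed = not module.check_mode

        module.exit_json(changed=changed, service=service, state=state)

    if __name__ == "__main__":
        main()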

You might not need to go “deep”

Deep learning is a type of machine learning in which artificial neural networks, algorithms loosely inspired by the human brain, process and learn from large amounts of data. While deep learning is a powerful tool for a specific set of problems, it’s not necessarily the best solution for every issue.

We heard time and time again that there are often simpler ways to solve problems using techniques that have been around for much longer but might not be as buzzy. Brad Miro, a developer programs engineer at Google, discussed techniques for distributing machine learning training, including data parallelism and model parallelism, and gave the audience an overview of how to build scalable machine learning systems with Python. Following that theme, Gabe Levine and Jonathan Arfa talked about how easier-to-use algorithms like Gradient Boosted Trees (GBTs) can solve your benchmark problems without the headache and uncertainty that often come with neural networks.
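
As a concrete illustration of that point, here is a small, hypothetical sketch, assuming scikit-learn is installed (the talk’s actual code and data aren’t reproduced here): a gradient boosted trees baseline on synthetic tabular data that is quick to fit and easy to reason about before reaching for a neural network.

    # A minimal gradient boosted trees baseline; assumes scikit-learn is available.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Synthetic tabular data standing in for a real benchmark dataset.
    X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    model = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
    model.fit(X_train, y_train)

    print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))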

At Hux, machine learning and the extensive set of machine learning tools within the Python ecosystem are fundamental to what we deliver. This refreshing shift in perspective reminds us that we often already have the tools to deliver the right message and content on the right channel at the right time.

Intersection of work life, open source life, and real life

A handful of the discussions at PyGotham focused on an incredibly important topic: mental health.

There were discussions about how open source (publicly accessible software code) can prevent burnout in the Python community, and how we need to work together to better sustain the infrastructure of open source projects. We also heard a powerful talk from Piper Thunstrom, who discussed her experiences transitioning publicly in the New York City Python community. She explained how she has begun to accept her career success in the developer world and move on from her perceived failures of the past.

The conference gave us the opportunity to meet the humans behind the personas we’ve interacted with online and build stronger bonds with the larger Python community. The discussions energized and inspired us to take on programming challenges in new ways, and we can’t wait to get to work.

Learn more about Hux.