Tasting Coffee, Coding AI. Meet Rafał – A Fullstack Developer
Check out Rafał’s journey to becoming an AI development specialist and discover which niche still hasn’t been addressed by AI. Also, learn how to explore coffee taste with just one sip.
Your passion for new technology started when you built your computer at 13. Tell us more about that experience.
I dreamt about having a better computer than the one I had at home. I wanted to build a computer from scratch, and I managed to assemble most of the parts with some help from my parents. I had to read tech documentation, and that’s how it all started. Years later, I still read the documentation. 🙂
When I had this computer, I got deep into the gaming community. I started creating simple webpages based on CMS platforms like WordPress. Then I began building game servers. Those were times when game creators were more open to the community; you could add lots of mods and run your own servers. You could even earn some money from it.
Then, in technical school, I followed my passion for informatics and explored web development and basic frontend skills. I went to college to study Applied Computer Science, and that’s when programming truly began for me. My studies forced me to learn some older languages like C++, but I quickly came to appreciate that. Knowing these “clunky” languages made it easier to switch to other technologies and enjoy their simplicity. Obviously, my studies didn’t give me all the knowledge I needed, especially when it came to working in a team with Git and CI/CD tools.
How did your and Bright Inventions’ paths cross?
It was two years ago. I was working at a software development agency, but my project ended, and I was devastated because I had enjoyed working with that team. Eventually, Agata (a recruiter) reached out to me on LinkedIn. That wasn’t anything new, but her message stood out because she was very transparent about the company, the projects, and the clients I could work with. That’s not always the case when you talk to recruiters. Also, at Bright Inventions, I had the chance to work with a tech stack I was keen on, including Node.js and NestJS.
I actually wanted to join Bright Inventions after the initial screening with Agata. I was already super excited about this job opportunity.
You are one of our AI development specialists. What was your research process and how did you gain knowledge about AI?
The entire AI research group at Bright Inventions focused on exploring large language models (LLMs) and frameworks, along with concrete solutions such as building vector databases and using them with embeddings, to tailor AI to specific use cases. This exploration led us to discover LangChain. LangChain has extensive documentation, which has been incredibly helpful.
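To give an idea of what building a vector database and using it with embeddings can look like, here is a minimal sketch using LangChain with OpenAI embeddings and an in-memory FAISS index. The document texts, the query, and the package choices (langchain-openai, langchain-community, faiss-cpu, plus an OPENAI_API_KEY in the environment) are assumptions for illustration, not a description of a specific project.

```python
# Minimal sketch: embed a few domain documents and retrieve the most relevant
# ones for a question. Assumes langchain-openai, langchain-community, faiss-cpu
# and an OPENAI_API_KEY; the texts and the query are placeholder examples.
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import FAISS

docs = [
    "Refunds are processed within 5 business days.",
    "Our support team answers billing questions within 24 hours.",
    "Enterprise customers get a dedicated account manager.",
]

embeddings = OpenAIEmbeddings()             # turns text into vectors
store = FAISS.from_texts(docs, embeddings)  # small in-memory vector database

# Retrieve the documents most similar to the user's question.
for doc in store.similarity_search("How long do refunds take?", k=2):
    print(doc.page_content)
```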
We also read numerous blog posts, watched various videos, and listened to podcasts, although we found that this kind of content quickly became outdated. Documentation has therefore proven to be the best source of information. While other content was useful, it primarily played a supporting role in our research process.
What is the skillset for a software developer who wants to specialize in AI implementation?
Knowing Python is key, as it leads the way in languages used for AI development. Understanding prompt engineering is also important. You need to read extensively about how models accept prompts, how they analyze them, and how to construct effective prompts.
Additionally, it’s crucial to understand what the various models can truly offer. Knowing how models work and which ones are available on the market is essential for choosing the right one for a given use case, especially with a strict budget. Fortunately, there are tools like PromptFoo that can help assess the costs a solution will generate.
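PromptFoo has its own configuration format for comparing prompts, providers, and costs; as a rough back-of-the-envelope alternative, simply counting tokens already gives a feel for what a prompt will cost. Below is a small sketch using the tiktoken library; the per-1K-token prices are placeholder assumptions, not actual provider pricing.

```python
# Rough cost estimate for a single prompt: count input tokens and multiply by a
# per-token price, plus an assumed number of output tokens. The prices are
# placeholders; always check the provider's current pricing.
import tiktoken

PRICE_PER_1K_INPUT = 0.0005    # USD per 1K input tokens (placeholder)
PRICE_PER_1K_OUTPUT = 0.0015   # USD per 1K output tokens (placeholder)

def estimate_cost(prompt: str, expected_output_tokens: int = 300) -> float:
    enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by many OpenAI models
    input_tokens = len(enc.encode(prompt))
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT + \
           (expected_output_tokens / 1000) * PRICE_PER_1K_OUTPUT

print(f"Estimated cost: ${estimate_cost('Summarize our refund policy.'):.6f}")
```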
What is more effective and affordable for a company that wants to implement generative AI: model training or enhancing prompt engineering?
I recommend prompt engineering. Model training requires more expertise, which increases the budget and resources needed. Additionally, you often have to pay extra for a model that allows training. Creating a new prompt with a wider context to generate better results is usually less expensive than training a model.
It's not only about prompt engineering though. Building vector databases can also be more affordable than training models.
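To make the “wider context” idea concrete: instead of training a model on company data, the relevant snippets (for example, the ones retrieved from a vector database as in the earlier sketch) can simply be placed in the prompt. Here is a minimal sketch with the OpenAI Python client; the model name and the context text are assumptions for illustration.

```python
# Sketch: enrich the prompt with retrieved context instead of fine-tuning.
# Assumes the openai package and an OPENAI_API_KEY; the model name and the
# context snippets are placeholders chosen for this example.
from openai import OpenAI

client = OpenAI()

context = (
    "Refunds are processed within 5 business days. "
    "Enterprise customers get a dedicated account manager."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": f"Answer using only the context below.\n\nContext:\n{context}"},
        {"role": "user", "content": "How long do refunds take?"},
    ],
)

print(response.choices[0].message.content)
```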
What trends do you see in the implementation of generative AI?
I’ve noticed a need that still hasn’t been fully addressed by generative AI. We still lack models that recognize specific items from pictures or understand voice prompts. Using written prompts is quite common, yet what about other options?
I think the medtech industry will benefit from image recognition models, for example, for medical imaging. AI can support lung cancer diagnostics based on thousands of X-ray images. This is just one of many possible utilizations.
What are the differences between LLMs and SLMs? Are SLMs a serious alternative to LLMs?
Small Language Models (SLMs) are supposed to be trained on less data and therefore require fewer computational resources than LLMs. [Read more about LLMs, SLMs, and all those AI buzzwords.] Yet that’s the theory.
We have tested SLMs on laptops for personal use, and their responses to simple questions were usually correct. However, when the questions became more complex, these models made mistakes. Given the high cost of running SLMs, it’s not an investment that generates the desired returns, especially compared to the models offered through the OpenAI API. With OpenAI, we don’t need to set up our own infrastructure; we only pay for the actual requests we make.
There have also been situations where some SLMs were not accessible through providers like AWS or Azure, meaning you have to set up and manage the infrastructure yourself and bear all the associated costs.
Therefore, I would consider whether SLMs are the right models, as they might not deliver the expected results.
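For context on what testing SLMs locally can look like: a small open model can be run on your own machine with the Hugging Face transformers library, as in the sketch below. The model name is just one example of a small model, not a recommendation; the point is that the download, memory, and compute are all on your side, unlike a pay-per-request API.

```python
# Sketch: run a small open language model locally with Hugging Face transformers.
# The model name is only an example of a small model; any comparable SLM works.
# Unlike a hosted API, the download, memory, and compute are on your own machine.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example small model
)

prompt = "Question: What is a vector database?\nAnswer:"
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```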
I know you are a coffee aficionado. What coffee do you drink and what acidity do you prefer?
It’s actually fascinating, because it’s hard to drink the same coffee twice; even the filter can change the taste.
I enjoy very acidic coffee, but only when it’s brewed with a drip method. I don’t like to drink a straight espresso; for me, espresso serves as a base for milk coffees like cappuccino or flat white. When it comes to black coffee, it has to be brewed with a drip method.
What coffee brand do you recommend?
I support a local business and buy coffee from a roastery in Łódź called Coffeehood. They offer specialty coffee, which means it has been rated as high quality during a coffee cupping.
Are you attending coffee cuppings? Tell us more about those.
Yes! These are coffee tastings where a coffee shop presents eight different types of coffee from various roasteries around the world. The goal is to assess the coffee from many different perspectives. First, you judge the sensory aspects: the smell and the aromas you can detect. Then, you take a first sip to evaluate its taste, structure, and how oily or bitter it is. The first sip is crucial, but it gets harder with each coffee, as your taste buds are no longer as fresh as they were at the beginning of the event. But exploring new coffee tastes is worth it!