Saturday, December 2, 2023

A comprehensive overview of generative AI and LLMs' trends, use cases, and future implications II. - Engineering and development insights

7 weeks (from 4 September to 22 October 2023) in the world of Large Language Models and Generative AI tools, this time more focused on the engineering side:


Prompt engineering:

Parallel processing in prompt engineering: the skeleton-of-thought technique.

Unlocking reliable generations through Chain-of-Verification - a leap in prompt engineering.

LLMOps: production prompt engineering patterns with Hamilton.

Crafting different types of program simulation prompts - defining the new program simulation prompt framework.

Some kick-ass prompt engineering techniques to boost our LLMs.

And other prompt engineering tips, a neural network how-to, and recent must-reads.


AI Development and Engineering:

The team behind GitHub Copilot shares its lessons from building the app.

Amazon Bedrock for building and scaling generative applications is now generally available.

Experience from building generative AI apps on Amazon Web Services, using Amazon Bedrock and SageMaker.

A guide with 7 steps for mastering LLMs.

Key tools for enhancing Generative AI in data lakehouses.

An introduction to loading Large Language models.

Introduction to ML engineering and LLMOps with OpenAI and LangChain.

MLOps and LLM deployment strategies for software engineers.

Modern MLOps platform for Generative AI.

Leveraging the power of LLMs to guide AutoML hyperparameter searches.

LLMs demand Observability-Driven Development.

LLM monitoring and observability — a summary of techniques and approaches.

How to build and benchmark your LLM evals.

A step-by-step guide to selecting and running your own generative model.

Google Research: Outperforming larger language models with less training data and smaller model sizes - distilling step-by-step.

Google Research: Rethinking calibration for in-context learning and prompt engineering.

Apache Kafka as a mission-critical Data Fabric for GenAI.

Training ChatGPT on your own data.

Hugging Face's guide to optimizing LLMs in production.

Hugging Face is becoming the "GitHub" for Large Language Models.

Building a microservice for multi-chat backends using Llama and ChatGPT.

Connect GPT models with company data in Microsoft Azure.

Tuning LLMs with MakerSuite.

Fine-tuning LLMs: Parameter Efficient Fine Tuning (PEFT), LoRA and QLoRA.

How to train BERT for masked language modeling tasks.

Extending context length in Large Language Models.

Conversational applications with Large Language Models: understanding the sequence of user inputs, prompts, and responses.

Using data lakes and Large Language Models in development.

How to build an LLM from scratch.

LLM output parsing: function calling vs. LangChain.

Enhancing the power of Llama 2: 3 easy methods for improving your Large Language Model.


Keeping LLMs relevant and current - Retrieval Augmented Generation (RAG).

Build and deploy Retrieval Augmented Generative Pipelines with Haystack.

Why your RAG is not reliable in a production environment.


QCon San Francisco: 

Unlocking enterprise value with Large Language Models.

A modern compute stack for scaling large AI, ML, & LLM workloads.

Saturday, November 25, 2023

A comprehensive overview of generative AI and LLMs' trends, use cases, and future implications I. - Business, technology trends and applications

7 weeks (from 4 September to 22 October 2023) in the world of Large Language Models and Generative AI tools:


AI in Business and Technology Trends:

How OpenAI turned LLMs into a mainstream success.

Oracle outlines a vision for AI and a cloud-driven future.

Enterprise SaaS companies have announced generative AI features, threatening AI startups.

How Generative AI is disrupting data practices.

Data Provenance in the age of Generative AI.

Is ChatGPT going to take data science jobs?

40% of the labour force will be affected by AI in 3 years.

And Gartner says: 

55% of organizations are in piloting or production mode with Generative AI.

CIOs must prioritize their AI ambition and AI-ready scenarios for the next 12-24 months.

More than 80% of enterprises will have used Generative AI APIs or deployed Generative AI-enabled applications by 2026.

60% of seller work will be executed by Generative AI technologies within five years.


AI Applications and Use Cases:

Large Language Models in real-world customer experience applications.

Five generative AI use cases companies can implement today.

Five use cases for CFOs using generative AI.

Revolutionizing business automation with generative AI.

Redefining conversational AI with Large Language Models.

Pros and cons of using LLMs for moderating bad content.

Generative AI on research papers using the Nougat model.

Document topic extraction with Large Language Models and the Latent Dirichlet Allocation (LDA) algorithm.

Using AI to add vector search to Cassandra in six weeks.

Monday, November 13, 2023

Large Language Models and other AI tools in software development (from 4 September to 22 October 2023)

7 weeks (from 4 September to 22 October 2023) in the world of Large Language Models and other AI tools used for software development:

List of five free AI Tools for programmers (Amazon CodeWhisperer, ChatGPT, CodeGeeX, GitHub Copilot, Bugasura).

A more detailed comparison of AI tools for programmers - the same list as above, except that Replit is covered instead of Bugasura.

And here are 5 ChatGPT alternatives for code generation (Tabnine, Kite, Codota, DeepCode, GitHub Copilot).

Comparing ChatGPT with Bard AI - for software development.

GitHub Copilot Chat in open beta - now available for all individuals in Visual Studio and VS Code.

Couchbase has introduced generative AI capabilities for SQL into its Database as a Service (Couchbase Capella).

MetaGPT - ChatGPT-powered AI assistant turning text into ChatGPT-based apps.

AI Assistant for IntelliJ-based IDEs update for October 2023.

Meta open-sources the code-generation LLM Code Llama.

A new customization capability in Amazon CodeWhisperer generates even better suggestions (Preview).

Chatting with the GM of CodeWhisperer.

How is GenAI different from other code generators?

Is AI enough to increase your productivity?

The future of AI in software development - trends and innovations.

Reimagining application development with AI - a new paradigm.

The pitfalls of using general AI in software development - a case for a human-centric approach.

The challenges of producing quality code when using AI-based generalistic models.

Applying Large Language Models (LLM) to software requirements - creating a knowledge hub of business logic and copilot for faster development.

Chat with the Oracle DB - leveraging OpenAI models to query the Oracle DB, building a Text-to-SQL tool and testing it on a public dataset.

Leveraging GPT models to transform natural language into SQL queries by teaching GPT to query via few-shot prompting (see the sketch at the end of this list).

Retro-engineering a database schema with Llama 2 - the idea is to ask the LLM to analyze sample data and provide some insight into what the initial database schema might look like.

‘Talk’ to Your SQL Database Using LangChain and Azure OpenAI.

AI-driven microservice automation - use ChatGPT to build a MySQL database model, and add API Logic Server to automate the creation of a SQLAlchemy model, a react-admin UI, and OpenAPI (Swagger).
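To illustrate the few-shot prompting idea from the natural-language-to-SQL item above, here is a minimal sketch using the OpenAI Python client (v1+). The table schema, example pairs, and model name are my own assumptions for illustration, not taken from the linked article, and the call assumes an OPENAI_API_KEY in the environment:

from openai import OpenAI  # assumes the openai package (v1+) and OPENAI_API_KEY set

client = OpenAI()

# Hypothetical schema; in practice you would generate this from your database metadata.
SYSTEM_PROMPT = """You translate questions into SQL for this schema:
orders(id, customer_id, amount, created_at)
customers(id, name, country)
Answer with SQL only."""

# Few-shot examples showing the expected question -> SQL mapping.
FEW_SHOT_EXAMPLES = [
    {"role": "user", "content": "How many customers are from Germany?"},
    {"role": "assistant", "content": "SELECT COUNT(*) FROM customers WHERE country = 'Germany';"},
    {"role": "user", "content": "Total order amount per customer name?"},
    {"role": "assistant", "content": "SELECT c.name, SUM(o.amount) FROM orders o JOIN customers c ON c.id = o.customer_id GROUP BY c.name;"},
]

def to_sql(question: str) -> str:
    messages = [{"role": "system", "content": SYSTEM_PROMPT}, *FEW_SHOT_EXAMPLES,
                {"role": "user", "content": question}]
    response = client.chat.completions.create(model="gpt-4", messages=messages)
    return response.choices[0].message.content

print(to_sql("Top 5 customers by total order amount in 2023?"))

The example pairs act as the "few shots": they show the model the expected dialect and answer format without any fine-tuning.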

Tuesday, September 19, 2023

Combining software development principles and patterns with GRASP

As software development has evolved over the years, developers have formulated best practices, principles, and design patterns to create more robust and maintainable systems. In this article, we will explore the differences between software development principles and design patterns, and then dive into the GRASP principles. We will also discuss how GRASP combines principles and patterns, and how it can help us decide what to use.

What is the difference between software development principles and design patterns? They are both essential concepts in software engineering, but they serve different purposes and operate at different levels of abstraction.

Software development principles are essential for creating high-quality software that is efficient, maintainable, and scalable. By following these principles, development teams can reduce costs, speed up development, and create better products that meet user needs. The principles are high-level guidelines or best practices that inform the software development process. They are often broad and language-agnostic, applying to various programming languages and paradigms. Principles promote qualities like maintainability, modularity, efficiency, and simplicity.

You should have a very good reason any time you choose not to follow principles.

Software development design patterns help developers create better software by offering efficient, reusable solutions to common problems that arise during software design. Design patterns lead to improved code quality, easier maintainability, and more effective communication among team members. They also promote scalability and adaptability, and serve as valuable learning tools for developers. They are more concrete and detailed than principles, providing implementation guidelines for specific design challenges. They may be more closely tied to a particular programming paradigm (e.g., object-oriented, functional, etc.).

You should have a very good reason any time you choose to implement a pattern.

One specific set of principles from object design that offers an interesting way to think about connecting principles and patterns is GRASP (General Responsibility Assignment Software Patterns), described by Craig Larman in his book Applying UML and Patterns (1997).

It addresses specific development challenges and collects proven programming principles of object-oriented design, rather than being just a set of criteria for creating better software (like SOLID).

It is more a collection of best-practice answers to frequently encountered coding challenges and serves as a guide for making design decisions. It consists of nine principles, each answering specific questions:

Creator

 - Who creates an object or a new instance of a class?

 - Assign the responsibility of creating an object to a class that is closely related to it. 

Related patterns are Factory Method and Abstract Factory. These patterns encapsulate the object-creation logic, assigning the responsibility of creating objects to a dedicated factory class. They promote low coupling and high cohesion by keeping related object-creation logic within a single class.
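As a minimal sketch of the Factory Method pattern in Python (the Document, Invoice, and Report classes are hypothetical, just to show the shape of the pattern):

from abc import ABC, abstractmethod

class Document(ABC):
    @abstractmethod
    def render(self) -> str: ...

class Invoice(Document):
    def render(self) -> str:
        return "Invoice contents"

class Report(Document):
    def render(self) -> str:
        return "Report contents"

class DocumentCreator(ABC):
    """Creator: declares the factory method that subclasses override."""
    @abstractmethod
    def create_document(self) -> Document: ...

    def export(self) -> str:
        # Client-facing logic works only against the abstract factory method.
        return self.create_document().render()

class InvoiceCreator(DocumentCreator):
    def create_document(self) -> Document:
        return Invoice()

class ReportCreator(DocumentCreator):
    def create_document(self) -> Document:
        return Report()

print(InvoiceCreator().export())  # Invoice contents
print(ReportCreator().export())   # Report contents

The client works only with DocumentCreator, so adding a new document type means adding a new creator subclass rather than touching existing code.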

Information Expert

 - What responsibilities can be assigned to an object?

 - Assign responsibility to the class that has the information necessary to fulfill it. 

This helps us increase cohesion, promotes encapsulation, and improves maintainability.
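A small sketch of the Information Expert idea in Python (Order and OrderLine are hypothetical): each object computes the result for which it already holds the data.

from dataclasses import dataclass, field

@dataclass
class OrderLine:
    unit_price: float
    quantity: int

    def subtotal(self) -> float:
        # OrderLine holds the price and quantity, so it is the expert for the subtotal.
        return self.unit_price * self.quantity

@dataclass
class Order:
    lines: list[OrderLine] = field(default_factory=list)

    def total(self) -> float:
        # Order holds the lines, so it is the expert for the overall total.
        return sum(line.subtotal() for line in self.lines)

order = Order([OrderLine(10.0, 2), OrderLine(5.0, 1)])
print(order.total())  # 25.0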

Low Coupling

 - How are objects connected to each other? How to support low dependency, low change impact, and increased reuse?

 - Design classes with minimal dependencies on other classes to promote modularity, improve maintainability, and increase reuse potential.

The Adapter is a design pattern that helps to achieve low coupling. It introduces an adapter class that acts as an intermediary between the incompatible interfaces, reducing the coupling between the classes.
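A minimal Adapter sketch in Python (PaymentProcessor and LegacyGateway are hypothetical names): the rest of the code depends only on the target interface, while the adapter is the single place coupled to the legacy API.

from abc import ABC, abstractmethod

class PaymentProcessor(ABC):
    """The interface the rest of the application depends on."""
    @abstractmethod
    def pay(self, amount: float) -> None: ...

class LegacyGateway:
    """Existing class with an incompatible interface (hypothetical)."""
    def make_payment(self, cents: int) -> None:
        print(f"Legacy gateway charged {cents} cents")

class LegacyGatewayAdapter(PaymentProcessor):
    """Adapter: the only class that knows about LegacyGateway's API."""
    def __init__(self, gateway: LegacyGateway) -> None:
        self._gateway = gateway

    def pay(self, amount: float) -> None:
        self._gateway.make_payment(int(round(amount * 100)))

def checkout(processor: PaymentProcessor) -> None:
    # Client code is coupled only to the PaymentProcessor abstraction.
    processor.pay(19.99)

checkout(LegacyGatewayAdapter(LegacyGateway()))  # Legacy gateway charged 1999 cents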

Controller

 - How are input events delegated from the UI/API layer to the domain layer, including coordinating a system operation? 

 - Assign the responsibility of handling system events to a class that represents the overall system, a subsystem, or a use case. The controller is defined as the first object beyond the UI layer that receives and coordinates a system operation. This principle helps in managing system complexity by separating UI from business logic.

The related principle is Pure Fabrication. The related design patterns are, for example, Command and Facade, or Model-View-Controller (MVC).
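A small sketch of a use-case controller in Python (OrderService and PlaceOrderController are hypothetical): the controller is the first object beyond the UI layer, so the UI only translates the request and delegates.

class OrderService:
    """Domain-layer service (hypothetical)."""
    def place_order(self, customer_id: str, items: list[str]) -> str:
        return f"order-001 placed for {customer_id} with {len(items)} item(s)"

class PlaceOrderController:
    """First object beyond the UI layer: receives the system event and
    coordinates the domain objects, keeping the UI free of business logic."""
    def __init__(self, service: OrderService) -> None:
        self._service = service

    def handle_place_order(self, customer_id: str, items: list[str]) -> str:
        # Coordination and delegation happen here, not in the UI handler.
        if not items:
            raise ValueError("Cannot place an empty order")
        return self._service.place_order(customer_id, items)

# A UI or API handler would only translate the request and call the controller:
controller = PlaceOrderController(OrderService())
print(controller.handle_place_order("cust-42", ["book", "pen"]))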

High Cohesion

 - How are the operations of elements functionally related? How to keep objects focused, understandable, and manageable?

 - The responsibilities of a given set of elements should be strongly related and highly focused on a rather specific topic. Breaking programs into classes and subsystems, if correctly done, is an example of activities that increase cohesion. Classes with closely related responsibilities are more understandable, maintainable, and robust.


Polymorphism

 - How to handle alternative elements based on type? How to create pluggable software components?

 - Assign the responsibility of defining a common interface to related classes, allowing them to be used interchangeably. This principle supports reusability and flexibility. Polymorphic operations should be used instead of explicit branching based on type.

You can use the Strategy pattern here. It defines a common interface for the varying algorithms, allowing them to be used interchangeably. Polymorphism is achieved by using the common interface for different implementations.
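A minimal Strategy sketch in Python (the shipping strategies are hypothetical): the polymorphic cost() call replaces explicit branching on a "shipping type" flag.

from abc import ABC, abstractmethod

class ShippingStrategy(ABC):
    """Common interface for the interchangeable algorithms."""
    @abstractmethod
    def cost(self, weight_kg: float) -> float: ...

class StandardShipping(ShippingStrategy):
    def cost(self, weight_kg: float) -> float:
        return 5.0 + 0.5 * weight_kg

class ExpressShipping(ShippingStrategy):
    def cost(self, weight_kg: float) -> float:
        return 12.0 + 1.25 * weight_kg

def quote(strategy: ShippingStrategy, weight_kg: float) -> float:
    # The caller never inspects the concrete type; it just calls cost().
    return strategy.cost(weight_kg)

print(quote(StandardShipping(), 3.0))  # 6.5
print(quote(ExpressShipping(), 3.0))   # 15.75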

Indirection

 - How to avoid a direct coupling between two or more elements and increase reuse potential?

 - Introduce an intermediate class to mediate between other classes, thus reducing coupling and promoting flexibility.

When you want to reduce coupling between a group of classes that communicate with each other, you can apply the Mediator pattern. This pattern introduces a mediator class that acts as an intermediary, managing the communication and relationships between the classes. Another related pattern is, for example, the already mentioned Adapter.
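A minimal Mediator sketch in Python (ChatRoom and User are hypothetical): the users never reference each other directly, only the mediator.

class ChatRoom:
    """Mediator: users talk to the room, never directly to each other."""
    def __init__(self):
        self._users = []  # registered User objects

    def register(self, user):
        self._users.append(user)
        user.room = self

    def broadcast(self, sender, message):
        for user in self._users:
            if user is not sender:
                user.receive(sender.name, message)

class User:
    def __init__(self, name):
        self.name = name
        self.room = None  # set by ChatRoom.register

    def send(self, message):
        if self.room:
            self.room.broadcast(self, message)

    def receive(self, sender_name, message):
        print(f"{self.name} got '{message}' from {sender_name}")

room = ChatRoom()
alice, bob = User("Alice"), User("Bob")
room.register(alice)
room.register(bob)
alice.send("Hello")  # Bob got 'Hello' from Alice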

Pure Fabrication

 - How to achieve high cohesion and low coupling of problem domain elements?

 - Assign the responsibility to an artificial class created just for the purpose of achieving High Cohesion and Low Coupling. Called a ‘service’ in domain-driven design, such a class does not represent anything from the problem domain; it exists purely to keep cohesion high and coupling low.
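A small sketch of a Pure Fabrication in Python (CustomerRepository is a hypothetical example): the repository is not a real-world concept, it exists only so that persistence details stay out of the domain class.

import json
from dataclasses import dataclass, asdict

@dataclass
class Customer:
    """Domain class: knows nothing about storage."""
    customer_id: str
    name: str

class CustomerRepository:
    """Pure Fabrication: not a problem-domain concept, invented so that
    persistence logic does not leak into the Customer class."""
    def __init__(self, path: str) -> None:
        self._path = path

    def save(self, customer: Customer) -> None:
        with open(self._path, "w", encoding="utf-8") as f:
            json.dump(asdict(customer), f)

    def load(self) -> Customer:
        with open(self._path, "r", encoding="utf-8") as f:
            return Customer(**json.load(f))

repo = CustomerRepository("customer.json")
repo.save(Customer("c-1", "Ada"))
print(repo.load().name)  # Ada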

Protected Variations

 - How to design objects, subsystems, and systems so that variations in these elements do not impact other elements?

 - Design the system in a way that it is stable in the face of changes by encapsulating variations.

Protected Variations helps us achieve robustness in our system.

You can use the Bridge pattern to ensure that changes in one class hierarchy don't affect another. This pattern separates an abstraction from its implementation, protecting the variations by encapsulating them within separate class hierarchies.
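A minimal Bridge sketch in Python (Circle and the renderers are hypothetical): the shape hierarchy and the rendering hierarchy can vary independently.

from abc import ABC, abstractmethod

class Renderer(ABC):
    """Implementation hierarchy: how a shape gets drawn."""
    @abstractmethod
    def draw_circle(self, radius: float) -> str: ...

class VectorRenderer(Renderer):
    def draw_circle(self, radius: float) -> str:
        return f"vector circle with radius {radius}"

class RasterRenderer(Renderer):
    def draw_circle(self, radius: float) -> str:
        return f"raster circle with radius {radius}"

class Circle:
    """Abstraction hierarchy: what a shape is. It only holds a reference to a
    Renderer, so new renderers do not force changes here (and vice versa)."""
    def __init__(self, radius: float, renderer: Renderer) -> None:
        self.radius = radius
        self.renderer = renderer

    def draw(self) -> str:
        return self.renderer.draw_circle(self.radius)

print(Circle(2.0, VectorRenderer()).draw())  # vector circle with radius 2.0
print(Circle(2.0, RasterRenderer()).draw())  # raster circle with radius 2.0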


Understanding software development principles, design patterns, and GRASP principles is crucial for developers to create maintainable, scalable, and robust software systems. Applying GRASP principles helps in making better design decisions and potentially choosing the right design pattern for specific problems. 

For instance, if you need a way to create objects of different types based on input data, consider the Factory pattern (based on the Creator and Polymorphism principles).

By following these guidelines, developers can improve the overall quality of the code.

Tuesday, August 29, 2023

ChatGPT and ASCII art

Some time back, while experimenting with the ChatGPT service, I decided to test how proficient these language models are at dealing with ASCII art - a form of visual art that uses characters from the ASCII character set to create images and designs.
Presented below are the outputs generated by three distinct versions of the ChatGPT model available at that time, all in response to the prompt "write Hello World in ASCII art":

Legacy GPT-3.5:

Default GPT-3.5:


GPT-4:

As you can see, ASCII art presents a unique challenge for language models like ChatGPT. While these models excel at generating human-like text, their inability to effectively comprehend and create ASCII art remains evident.

The inability of ChatGPT models to handle ASCII art is attributed to their design, which is primarily centered around processing and generating text-based data. ASCII art, however, involves a visual and spatial understanding that goes beyond simple language patterns. The models cannot interpret the exact placement, sizing, and arrangement of ASCII characters to create complex visual designs.

The inability to effectively handle ASCII art exemplifies the gap between textual and visual comprehension within these models.

Friday, May 12, 2023

Vzkříšení II.

He watched as the surrounding structures began to collapse; the canteen started to crumble with them and finally disappeared into a pile of rubble. Flames engulfed an entire quarter of the base, and swarms of drones rose into the sky, moving in formations that resembled panicked flocks of birds.

The base was shaped like a cross, with a spaceport at its center. Each arm of the cross consisted of two long platforms with cranes and other equipment. On closer inspection, the distinct shape dissolved and the whole scene resembled a giant anthill teeming with countless small drones and irregular functional structures.

One side of the base was badly damaged; the flames mingled with the deep orange glow of the star scattering through the atmosphere. The star, which appeared twice the size of the Sun in the sky, hung low, just above the horizon, its shape distorted by refraction.

That was all he could see on the recording. His last memory before the event was of falling asleep in his room - between the event and that memory lay a nineteen-hour gap. Other video recordings were available, but they only showed him walking through the corridors. He decided to review them later.

"Probudil jste se po šedesáti osmi hodinách, po tom, co se nám podařilo omezit následky výbuchu a zajistit dostatečné zdroje. Útok na základnu byl jedním z několika souběžných útoků v systému. K dalším incidentům došlo na druhé planetě a v důlních zařízeních ve vnějších oblastech. Ztratili jsme téměř veškeré spojení s hypernetem, zůstaly jen dva malé datové portály. Další připojení dorazí v příštích osmi měsících prostřednictvím nadsvětelných lodí z nejbližšího strongpointu," řekl hlas a dodal: "To je vše, co vám mohu prozatím říci."

The voice had been his only source of information since waking. When it fell silent, a dreadful silence and emptiness surrounded him. There was nothing to see - only the void and his thoughts.

Monday, May 8, 2023

Vzkříšení I.

On the dark blue surface of the ocean, small fragments of ice were tossed about by large waves. From the observer's distance, the waves seemed rather insignificant. The freezing exterior was lit by faint orange rays of light coming from the obscured horizon, hidden just behind the tower structure.

The sky was a mixture of deep red and blue, almost free of clouds except for a few delicate cirrus formations. Fortunately, the cold weather was separated from the cozy canteen by a transparent barrier. At this hour the canteen was bustling, as people from the surrounding laboratories gathered around a long buffet table holding two long servings of in vitro meat - fish and beef.

Korven, however, was focused mainly on the scenery outside, gazing at the sky and clutching a mug of warm tea with both hands. He was lost in the music playing in his ears, undistracted by any news from home, simply resting for a while. He had just finished eating and kept watching the view, framed by a construction site with numerous mechanical arms and cranes on both sides. Directly below the section housing the canteen, a lone trimaran drone lay anchored.

And then...