GenAI Stack for Developers

Originally aired:

About the Session

This presentation explores the integration of preconfigured Large Language Models (LLMs) such as Llama2, GPT-3.5, and GPT-4 into AI projects. We’ll discuss how Ollama simplifies local LLM management, how Neo4j serves as the default vector database to improve AI/ML model accuracy, how Neo4j knowledge graphs ground GenAI predictions more precisely, and how LangChain orchestrates communication between the LLM, your applications, and the database, all running on Docker. So, are you ready to embrace the future of AI with this powerful GenAI stack?
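The stack described above (Ollama serving local LLMs, Neo4j doubling as the vector database, and a LangChain-based application, all wired together with Docker) could be sketched as a Docker Compose file. This is a minimal illustrative sketch, not the official GenAI Stack configuration: the image tags, ports, and environment variable names here are assumptions.

```yaml
# Hypothetical docker-compose.yml sketching the stack described above.
# Image tags, ports, and credentials are illustrative assumptions.
services:
  ollama:
    image: ollama/ollama            # serves local LLMs such as Llama2
    ports:
      - "11434:11434"               # Ollama's default HTTP API port
  neo4j:
    image: neo4j:5                  # graph database used as the vector store
    ports:
      - "7474:7474"                 # browser UI
      - "7687:7687"                 # Bolt protocol
    environment:
      - NEO4J_AUTH=neo4j/password   # placeholder credentials
  app:
    build: .                        # your LangChain application
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - NEO4J_URI=bolt://neo4j:7687
    depends_on:
      - ollama
      - neo4j
```

With a layout like this, the LangChain application reaches Ollama and Neo4j by service name on Docker's internal network, which is what lets the three pieces communicate without hard-coded host addresses.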


Hear What Attendees Say

PwC

“Once again Saltmarch has knocked it out of the park with interesting speakers, engaging content and challenging ideas. No jetlag fog at all, which counts for how interesting the whole thing was.”

Cybersecurity Lead, PwC

Intuit

“Very much looking forward to next year. I will be keeping my eye out for the date so I can make sure I lock it in my calendar.”

Software Engineering Specialist, Intuit

GroupOn

“Best conference I have ever been to with lots of insights and information on next generation technologies and those that are the need of the hour.”

Software Architect, GroupOn

Hear What Speakers & Sponsors Say

Scott Davis

“Happy to meet everyone who came from near and far. Glad to know you've discovered some great lessons here, and glad you joined us for all the discoveries great and small.”

Web Architect & Principal Engineer, Scott Davis

Dr. Venkat Subramaniam

“Wonderful set of conferences, well organized, fantastic speakers, and an amazingly interactive set of audience. Thanks for having me at the events!”

Founder of Agile Developer Inc., Dr. Venkat Subramaniam

Oracle Corp.

“What a buzz! The events have been instrumental in bringing the whole software community together. There has been something for everyone from developers to architects to business to vendors. Thanks everyone!”

Voltaire Yap, Global Events Manager, Oracle Corp.