Hi, I am new to this space but excited to build up my LangChain JS skills and turn a few ideas into products.
I am following along with the docs and the hands-on Quickstart.
I appreciate the doc structure (it feels much better than the current v0.3 docs), but I still ran into a few issues, detailed below.
1. Some nuances on when to use what. For example:
   - initChatModel(...).invoke() vs createAgent(...).invoke()
   - among the params, when to use a model string vs passing a model instance directly (llm: llm)
   - when to use a combined model string like "ollama:llama3.1:8b" vs "llama3.1:8b", { modelProvider: "ollama" }
   - when to provide tools as an array to createAgent vs binding them directly on new ChatOllama() with bindTools()

   I understand this could be flexibility provided intentionally, or optional usage driven by performance or specific use cases. What I am trying to figure out is whether, under the hood, one form helps with performance, offers more configuration, etc.
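On the combined model string specifically, here is a small sketch of how I would expect "ollama:llama3.1:8b" to be read. This is my own assumption for illustration, not the actual LangChain source: the string is plausibly split on the first colon only, so the provider becomes "ollama" and the rest ("llama3.1:8b", including its own colon) stays intact as the model name.

```javascript
// Hypothetical illustration (NOT the real LangChain parser): split a
// "provider:model" spec on the FIRST colon only, so model names that
// themselves contain a colon (like Ollama tags) survive unharmed.
function parseModelString(spec) {
  const i = spec.indexOf(":");
  if (i === -1) {
    // No provider prefix; the whole string is the model name.
    return { modelProvider: undefined, model: spec };
  }
  return { modelProvider: spec.slice(0, i), model: spec.slice(i + 1) };
}

console.log(parseModelString("ollama:llama3.1:8b"));
// { modelProvider: 'ollama', model: 'llama3.1:8b' }
```

If that reading is right, the two forms "ollama:llama3.1:8b" and "llama3.1:8b", { modelProvider: "ollama" } would be equivalent ways of saying the same thing, with the explicit option form avoiding any ambiguity.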
2. Ollama tools: need help here.
   I understand Ollama supports tools (per the "Chat models" docs by LangChain) and that only a limited set of models support them. We are trying to use llama3.1:8b with tools but get an error saying tool_choice is not supported by Ollama. Switching the quickstart from Ollama to OpenAI means cost and the friction of getting an API key.
   Recommendation: a quickstart built around local setups, including models, would save cost and billing while learning (using Ollama, LM Studio, or any free LLM provider locally or in the cloud). It could include setup steps for complete freshers, since there are a lot of moving parts. It would also make it easier to experiment without a cost burden, especially when, fresh from the quickstart, we do not yet know much about limiting tokens.
Please check the code below in case I am missing something. Appreciate any help here.
3. Perplexity not supported.
   I see the Perplexity chat model listed in LangChain, but it is not working for me. I need help figuring out whether I am using it correctly, and whether the docs need an update here.
   Please refer to the code below:
export default class LangAgent {
  constructor() {
    this.systemPrompt = systemPrompt;
    this.checkpointer = new MemorySaver();
    this.tools = [getUserLocation, getWeather];
    this.responseFormat = responseFormat;
    autoBind(this);
  }

  init = async () => {
    this.agent = createAgent({
      model: "ollama:llama3.1:8b",
      // TODO: uncomment to test Perplexity, and comment out the model above
      // llm: initChatModel("sonar", { modelProvider: "perplexity" }),
      prompt: this.systemPrompt,
      // TODO: comment out tools to check it works without tools
      tools: this.tools,
      responseFormat: this.responseFormat,
      checkpointer: this.checkpointer,
    });
  };

  chat = async (userQuery, config = {
    configurable: { thread_id: "1" },
    context: { user_id: "1" },
  }) => {
    return await this.agent.invoke({
      messages: [{ role: "user", content: userQuery }],
    }, config);
  };
}
I also tried with ChatOllama directly but get the same error: "Tool choice is not supported for ChatOllama."
My question is that I am not passing tool_choice at all; I am just following the docs.
Also, if we remove tools from createAgent, it gives the error below:
Tools not passed
/node_modules/langchain/dist/agents/ReactAgent.cjs:29
const toolClasses = Array.isArray(options.tools) ? options.tools : options.tools.tools;
^
TypeError: Cannot read properties of undefined (reading 'tools')
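Judging from the quoted line in ReactAgent.cjs, the crash comes from dereferencing options.tools when the key is omitted entirely. A guess (not verified against the library): passing an explicit empty array, tools: [], should sidestep the TypeError, since the Array.isArray branch is taken before the undefined access. A minimal reproduction of that line:

```javascript
// Paraphrase of the quoted ReactAgent.cjs line: when `tools` is omitted,
// the else-branch dereferences `options.tools.tools` on undefined.
function resolveTools(options) {
  return Array.isArray(options.tools) ? options.tools : options.tools.tools;
}

console.log(resolveTools({ tools: [] })); // [] — an empty array takes the safe branch

try {
  resolveTools({}); // `tools` omitted, as in the failing run
} catch (e) {
  console.log(e instanceof TypeError); // true — the same TypeError as above
}
```

So "no tools" likely has to be expressed as tools: [] rather than by dropping the option, at least in this version of the package.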
Perplexity model Error
Unsupported { modelProvider: perplexity }.
Supported model providers are: openai, anthropic, azure_openai, cohere, google-vertexai, google-vertexai-web, google-genai, ollama, mistralai, groq, cerebras, bedrock, deepseek, xai, fireworks, together
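The error message itself lists the providers that initChatModel can infer, and perplexity is simply not among them. A small pre-flight check (the list below is copied verbatim from the error above; the helper name is mine):

```javascript
// Provider list copied from the "Unsupported { modelProvider }" error message.
const SUPPORTED_PROVIDERS = [
  "openai", "anthropic", "azure_openai", "cohere", "google-vertexai",
  "google-vertexai-web", "google-genai", "ollama", "mistralai", "groq",
  "cerebras", "bedrock", "deepseek", "xai", "fireworks", "together",
];

// Check a provider string before handing it to initChatModel.
function isSupportedProvider(provider) {
  return SUPPORTED_PROVIDERS.includes(provider);
}

console.log(isSupportedProvider("perplexity")); // false — hence the error
console.log(isSupportedProvider("ollama"));     // true
```

If Perplexity really is not wired into initChatModel yet, a possible workaround (unverified on my side) would be instantiating a Perplexity chat model class directly, if the packages export one, and passing that instance to createAgent instead of a provider string.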
Ollama Tool Choice error
Tool choice is not supported for ChatOllama.
at ChatOllama.invocationParams node_modules/@langchain/ollama/dist/chat_models.js:342:19
4. isAIMessage is used in the v1.0 Quickstart guide, but it shows as deprecated in the @langchain/core@next package:

import { isAIMessage, ToolMessage } from "@langchain/core/messages";
These observations and issues come from someone new to LangChain and the AI world overall, trying to upskill on gen AI fundamentals (the Concepts section of the docs is helpful) through the docs and Quickstart.
PS: this is all on a JS stack with the langchain@next npm package.