Integration with Langflow
Run CUGA in Langflow
Overview
CUGA integrates seamlessly with Langflow, enabling you to build powerful visual AI workflows. This integration allows you to combine CUGA's agentic capabilities with Langflow's intuitive flow-based interface and MCP (Model Context Protocol) server tools.
Langflow provides a visual, drag-and-drop interface for building LLM applications. When combined with CUGA, you can create sophisticated agentic workflows without writing code.
Prerequisites
Before you begin, ensure you have:
- Python 3.12+ with the uv package manager
- API keys for your chosen LLM provider (OpenAI, Watsonx, etc.)
- Network access to MCP server endpoints
Make sure your Python environment is properly configured and you have the necessary permissions to install packages.
Installation & Setup
Install the Langflow nightly build using uv:
```
uv pip install langflow-nightly==1.6.5.dev1
```
We recommend using the nightly build for the latest CUGA integration features.
Launch the Langflow server:
```
uv run langflow run
```
Once running, Langflow will be accessible in your browser at http://localhost:7860.
When the Langflow UI opens in your browser:
- Click the "New Flow" button or select "Create Blank Flow" from the menu
- This will open a new empty canvas where you can build your CUGA workflow
Starting with a blank flow ensures a clean workspace for building your CUGA integration.
The Digital Sales MCP server is deployed at:
```
https://digitalsales-mcp.19pc1vtv090u.us-east.codeengine.appdomain.cloud/sse
```
This server provides tools for account management, contact lookup, and more.
You can test the endpoint accessibility using curl or by visiting it in your browser.
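If you prefer scripting the check, here is a minimal Python sketch using only the standard library. The helper names (`check_sse_endpoint`, `is_sse_content_type`) are ours for illustration, not part of CUGA or Langflow:

```python
import urllib.request

SSE_URL = "https://digitalsales-mcp.19pc1vtv090u.us-east.codeengine.appdomain.cloud/sse"

def is_sse_content_type(content_type: str) -> bool:
    # SSE servers answer with the text/event-stream media type.
    return "text/event-stream" in content_type.lower()

def check_sse_endpoint(url: str, timeout: float = 5.0) -> bool:
    # Request the stream and verify the response headers before reading any events.
    req = urllib.request.Request(url, headers={"Accept": "text/event-stream"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status == 200 and is_sse_content_type(
            resp.headers.get("Content-Type", ""))
```

Calling `check_sse_endpoint(SSE_URL)` should return `True` when the server is reachable from your network.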
Building Your Workflow
Component Overview
Configuration Steps
In the Langflow UI, drag and drop these components onto your canvas:
- Chat Input component
- MCP Tools component
- CUGA component
- Chat Output component
Select the MCP Tools component and configure:
- Set the MCP Server URL: https://digitalsales-mcp.19pc1vtv090u.us-east.codeengine.appdomain.cloud/sse
- Enable Tool Mode in the component settings
- Connect the MCP Tools output to the CUGA Tools input
Ensure "Enable Tool Mode" is checked, or CUGA won't be able to access the tools.
Choose your preferred LLM provider:
OpenAI Configuration:
Configure the CUGA component directly with your OpenAI API key.
- No additional LLM component needed
- Simply add your API key in CUGA settings
- Supports GPT-4, GPT-3.5-turbo, and other OpenAI models
IBM Watsonx Configuration:
- Add a new LLM component to the canvas
- Configure for Watsonx.ai:
- Set endpoint URL
- Add API key and Project ID
- Select a model (e.g., openai/gpt-oss-120b)
- In the CUGA component, select "Custom Model"
- Connect the Watsonx LLM component to CUGA
Watsonx integration allows you to use IBM's enterprise LLM platform with CUGA's agentic capabilities.
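As a rough sketch, the connection details gathered in the steps above amount to a small set of fields. This is illustrative only: the actual field names live in the Langflow LLM component UI and may differ from these keys, and the region URL is an assumption:

```python
# Illustrative only: field names and the region URL are assumptions,
# not taken from the Langflow component's actual settings.
watsonx_config = {
    "endpoint_url": "https://us-south.ml.cloud.ibm.com",  # assumed region URL
    "api_key": "YOUR_IBM_CLOUD_API_KEY",
    "project_id": "YOUR_WATSONX_PROJECT_ID",
    "model_id": "openai/gpt-oss-120b",  # model named in the steps above
}

def validate_watsonx_config(cfg: dict) -> list[str]:
    # Report any missing or placeholder values before wiring up the component.
    required = ["endpoint_url", "api_key", "project_id", "model_id"]
    return [k for k in required if not cfg.get(k) or cfg[k].startswith("YOUR_")]
```

Running the validator against the template above flags the two placeholder credentials that still need real values.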
CUGA supports custom policies through a policy.md file for enhanced control:

```markdown
## Plan
<Instructions for how CUGA should approach planning tasks>

Example:
- Break down complex queries into subtasks
- Prioritize information gathering before execution
- Consider dependencies between actions

## Answer
<Instructions for how CUGA should format final answers>

Example:
- Provide concise summaries with key findings
- Include relevant data points and metrics
- Cite sources when using MCP tool results
```

Upload this file to the CUGA component to customize its behavior.
Policy files allow you to fine-tune CUGA's reasoning and response formatting without modifying code.
Create the complete workflow by connecting:
- Chat Input → CUGA (Message/Input port)
- MCP Tools → CUGA (Tools input)
- CUGA → Chat Output (Output port)
Look for green connection lines between components to confirm successful connections.
Workflow Architecture
The final CUGA workflow in Langflow follows this structure:
```
[Chat Input] → [CUGA] → [Chat Output]
                ↑
                |
           [MCP Tools] (connected to Tools input)
                |
      [LLM Component] (optional - connected to Language Model input)
```

Data Flow
- Chat Input → CUGA: User messages flow into CUGA
- MCP Tools → CUGA: Tools are made available to CUGA
- LLM → CUGA: (Optional) Custom language model connection
- CUGA → Chat Output: CUGA's responses are displayed to the user
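The data flow above can be sketched as plain functions. This is purely illustrative: Langflow wires these components up visually, and none of these names come from the Langflow or CUGA APIs:

```python
from typing import Callable

# Purely illustrative stand-ins for the Langflow components; none of these
# names come from the Langflow or CUGA APIs.
def mcp_tools() -> dict[str, Callable[[str], str]]:
    # MCP Tools component: exposes named tools to the agent.
    return {"list_accounts": lambda _query: "[account data]"}

def cuga_agent(message: str, tools: dict, llm=None) -> str:
    # CUGA component: decides whether the request needs a tool call.
    if "account" in message.lower() and "list_accounts" in tools:
        return tools["list_accounts"](message)
    return f"(direct answer to: {message})"

def run_flow(user_message: str) -> str:
    # Chat Input → CUGA (with MCP Tools attached) → Chat Output.
    return cuga_agent(user_message, mcp_tools())
```

For example, `run_flow("Show me all accounts in the system")` routes through the tool, while a greeting is answered directly.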
Use Cases
Testing Your Setup
- In the Langflow UI, click the "Playground" button, then select "Chat" to open the chat interface.
- Type a message in the Chat Input component to initiate the workflow.
- Watch as CUGA analyzes the request and uses MCP tools when needed; you can monitor tool calls in the Langflow execution logs.
- Check the Chat Output for CUGA's formatted responses.
Example Test Queries
Try these queries to test your setup:

- "Show me all accounts in the system"
- "Find contacts for account XYZ"
- "What tools do you have available?"
- "Get my top account by revenue"

If CUGA successfully responds with account data, your integration is working correctly!
Troubleshooting
Additional Resources
Success! You've configured CUGA with Langflow. This setup enables CUGA to dynamically use MCP server tools while maintaining conversational context and goal-oriented behavior.
