redesigned your entire supply chain for more cost-efficient and sustainable operations?
Supply Chain Network Optimisation determines where goods are produced to serve markets at the lowest cost in an environmentally friendly way.

We must consider real-world constraints (capacity, demand) to find the optimal set of factories that will minimise the objective function.

As a Supply Chain Solutions Manager, I have led several network design studies that typically took 10–12 weeks.
The final deliverable was usually a deck of slides presenting several scenarios, allowing supply chain directors to weigh the trade-offs.

However, decision-makers were often frustrated during the presentations of the study results:
Director: “What if we increase the factory capacity by 25%?”
They wanted to challenge assumptions and re-run scenarios live, while all we had were the slides that had taken hours to prepare.
What if we could improve this user experience using conversational agents?
In this article, I show how I connected an MCP server to a FastAPI microservice with a Supply Chain Network Optimisation algorithm.

The result is a conversational agent that can run one or several scenarios and provide a detailed analysis with nice visuals.
We will even ask this agent to advise us on the best decision to take, considering our objectives and constraints.

For this experiment, I will use:
- Claude Desktop as the conversational interface
- An MCP server to expose typed tools to the agent
- A FastAPI microservice with the network optimisation endpoint
In the first section, I will introduce the problem of Supply Chain Network Design with a concrete example.
Then, I will show several deep analyses performed by the conversational agent to support strategic decision-making.

For the first time, I have been impressed by AI: the agent selected the right visuals to answer an open question without any guidance!
Supply Chain Network Optimisation with Python
Problem Statement: Supply Chain Network Design
We are supporting the Supply Chain Director of a global manufacturing company that would like to redefine its network for a long-term transformation plan.

This multinational company has operations in five different markets: Brazil, the USA, Germany, India and Japan.

To meet this demand, we can open low- or high-capacity factories in each of the markets.

If you open a facility, you must consider the fixed costs (related to electricity, real estate, and CAPEX) and the variable costs per unit produced.

In this example, high-capacity plants in India have lower fixed costs than lower-capacity plants in the USA.

Furthermore, there are the costs associated with shipping a container from Country XXX to Country YYY.
Everything summed up defines the total cost of producing and delivering products from a manufacturing site to the different markets.
What about sustainability?
In addition to these parameters, we consider the amount of resources consumed per unit produced.

For instance, we consume 780 MJ/unit of energy and 3,500 litres of water to produce a single unit in Indian factories.
For the environmental impacts, we also consider the pollution resulting from CO2 emissions and waste generation.

In the example above, Japan is the cleanest production country.
Where should we produce to minimise water usage?
The idea is to select a metric to minimise, which could be costs, water usage, CO2 emissions or energy usage.

The model will indicate where to locate factories and outline the flows from these factories to the various markets.
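Under the hood, this is a classic capacitated facility-location formulation. Below is a minimal sketch of such a model with PuLP, using made-up placeholder data; the variable names and parameters are illustrative, not the actual LogiGreen backend code:
import pulp

# Illustrative sets and placeholder parameters (not the real data)
markets = ["USA", "GERMANY", "JAPAN", "BRAZIL", "INDIA"]
sizes = ["LOW", "HIGH"]
demand = {m: 10000 for m in markets}                  # units per market
capacity = {"LOW": 5000, "HIGH": 20000}               # units per plant size
fixed_cost = {(i, s): 500000 for i in markets for s in sizes}
var_cost = {(i, j): 50 if i == j else 80 for i in markets for j in markets}

model = pulp.LpProblem("network_design", pulp.LpMinimize)

# Decision variables: open plant (country, size) and flow country -> market
open_plant = pulp.LpVariable.dicts("open", [(i, s) for i in markets for s in sizes], cat="Binary")
flow = pulp.LpVariable.dicts("flow", [(i, j) for i in markets for j in markets], lowBound=0)

# Objective: fixed costs of opened plants + variable production/shipping costs
model += (
    pulp.lpSum(fixed_cost[i, s] * open_plant[i, s] for i in markets for s in sizes)
    + pulp.lpSum(var_cost[i, j] * flow[i, j] for i in markets for j in markets)
)

# Each market demand must be satisfied
for j in markets:
    model += pulp.lpSum(flow[i, j] for i in markets) == demand[j]

# Outbound flow of a country is limited by the capacity it opened
for i in markets:
    model += pulp.lpSum(flow[i, j] for j in markets) <= pulp.lpSum(
        capacity[s] * open_plant[i, s] for s in sizes
    )

model.solve()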
This solution has been packaged as a web application (FastAPI backend, Streamlit front-end) used as a demo to showcase the capabilities of our startup LogiGreen.

The idea of today's experiment is to connect the backend to Claude Desktop using a local MCP server built with Python.
FastAPI Microservice: 0–1 Mixed-Integer Optimiser for Supply Chain Network Design
This tool is an optimisation model packaged in a FastAPI microservice.
What are the input data for this problem?
As inputs, we should provide the objective function (mandatory) and constraints on the maximum environmental impact per unit produced (optional).
from pydantic import BaseModel
from typing import Optional
from app.utils.config_loader import load_config

config = load_config()

class LaunchParamsNetwork(BaseModel):
    objective: Optional[str] = 'Production Cost'
    max_energy: Optional[float] = config["network_analysis"]["params_mapping"]["max_energy"]
    max_water: Optional[float] = config["network_analysis"]["params_mapping"]["max_water"]
    max_waste: Optional[float] = config["network_analysis"]["params_mapping"]["max_waste"]
    max_co2prod: Optional[float] = config["network_analysis"]["params_mapping"]["max_co2prod"]
The default values for the thresholds are stored in a config file.
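As an illustration only (the actual config_loader is not shown in this article), such a config and loader could look like the following, assuming a YAML file:
# Hypothetical config.yaml layout, inferred from the keys used above:
#
# network_analysis:
#   params_mapping:
#     max_energy: 780
#     max_water: 3500
#     max_waste: 0.78
#     max_co2prod: 41

import yaml

def load_config(path: str = "config.yaml") -> dict:
    """Load the application configuration from a YAML file."""
    with open(path) as f:
        return yaml.safe_load(f)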
We send these parameters to a specific endpoint launch_network that will run the optimisation algorithm.
@router.post("/launch_network")
async def launch_network(request: Request, params: LaunchParamsNetwork):
    try:
        session_id = request.headers.get('session_id', 'session')
        directory = config['general']['folders']['directory']
        folder_in = f'{directory}/{session_id}/network_analysis/input'
        folder_out = f'{directory}/{session_id}/network_analysis/output'
        network_analyzer = NetworkAnalysis(params, folder_in, folder_out)
        output = await network_analyzer.process()
        return output
    except Exception as e:
        logger.error(f"[Network]: Error in /launch_network: {str(e)}")
        raise HTTPException(status_code=500, detail=f"Failed to launch Network analysis: {str(e)}")
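Before wiring in any agent, the endpoint can be tested directly, for example with httpx; the host, port and session id below are placeholders:
import httpx

# Hypothetical local address; adjust to wherever the FastAPI service runs
response = httpx.post(
    "http://localhost:8000/network/launch_network",
    json={"objective": "Production Cost"},
    headers={"session_id": "demo_session"},
    timeout=60,
)
response.raise_for_status()
print(response.json())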
The API returns the JSON outputs in two parts.
In the section input_params, you will find:
- The objective function selected
- All the maximum limits per environmental impact
{ "input_params":
{ "goal": "Manufacturing Price",
"max_energy": 780,
"max_water": 3500,
"max_waste": 0.78,
"max_co2prod": 41,
"unit_monetary": "1e6",
"loc": [ "USA", "GERMANY", "JAPAN", "BRAZIL", "INDIA" ],
"n_loc": 5,
"plant_name": [ [ "USA", "LOW" ], [ "GERMANY", "LOW" ], [ "JAPAN", "LOW" ], [ "BRAZIL", "LOW" ], [ "INDIA", "LOW" ], [ "USA", "HIGH" ], [ "GERMANY", "HIGH" ], [ "JAPAN", "HIGH" ], [ "BRAZIL", "HIGH" ], [ "INDIA", "HIGH" ] ],
"prod_name": [ [ "USA", "USA" ], [ "USA", "GERMANY" ], [ "USA", "JAPAN" ], [ "USA", "BRAZIL" ], [ "USA", "INDIA" ], [ "GERMANY", "USA" ], [ "GERMANY", "GERMANY" ], [ "GERMANY", "JAPAN" ], [ "GERMANY", "BRAZIL" ], [ "GERMANY", "INDIA" ], [ "JAPAN", "USA" ], [ "JAPAN", "GERMANY" ], [ "JAPAN", "JAPAN" ], [ "JAPAN", "BRAZIL" ], [ "JAPAN", "INDIA" ], [ "BRAZIL", "USA" ], [ "BRAZIL", "GERMANY" ], [ "BRAZIL", "JAPAN" ], [ "BRAZIL", "BRAZIL" ], [ "BRAZIL", "INDIA" ], [ "INDIA", "USA" ], [ "INDIA", "GERMANY" ], [ "INDIA", "JAPAN" ], [ "INDIA", "BRAZIL" ], [ "INDIA", "INDIA" ] ],
"total_demand": 48950
}
I also added information to give context to the agent (a quick construction sketch follows the list):
- plant_name: a list of all the potential production locations we can open, by location and type
- prod_name: the list of all the potential production flows we can have (production country, market)
- total_demand: the total demand of all the markets
We do not return the demand per market, as it is loaded on the backend side.
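These context fields follow directly from the list of countries in scope; a quick illustrative sketch of how they can be built (not the backend code itself):
from itertools import product

loc = ["USA", "GERMANY", "JAPAN", "BRAZIL", "INDIA"]

# Potential plants: one LOW- and one HIGH-capacity option per country
plant_name = [[country, size] for size in ["LOW", "HIGH"] for country in loc]

# Potential flows: every (production country, market) pair
prod_name = [list(pair) for pair in product(loc, loc)]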
And you have the results of the analysis.
{
"output_results": {
"plant_opening": {
"USA-LOW": 0,
"GERMANY-LOW": 0,
"JAPAN-LOW": 0,
"BRAZIL-LOW": 0,
"INDIA-LOW": 1,
"USA-HIGH": 0,
"GERMANY-HIGH": 0,
"JAPAN-HIGH": 1,
"BRAZIL-HIGH": 1,
"INDIA-HIGH": 1
},
"flow_volumes": {
"USA-USA": 0,
"USA-GERMANY": 0,
"USA-JAPAN": 0,
"USA-BRAZIL": 0,
"USA-INDIA": 0,
"GERMANY-USA": 0,
"GERMANY-GERMANY": 0,
"GERMANY-JAPAN": 0,
"GERMANY-BRAZIL": 0,
"GERMANY-INDIA": 0,
"JAPAN-USA": 0,
"JAPAN-GERMANY": 0,
"JAPAN-JAPAN": 15000,
"JAPAN-BRAZIL": 0,
"JAPAN-INDIA": 0,
"BRAZIL-USA": 12500,
"BRAZIL-GERMANY": 0,
"BRAZIL-JAPAN": 0,
"BRAZIL-BRAZIL": 1450,
"BRAZIL-INDIA": 0,
"INDIA-USA": 15500,
"INDIA-GERMANY": 900,
"INDIA-JAPAN": 2000,
"INDIA-BRAZIL": 0,
"INDIA-INDIA": 1600
},
"local_prod": 18050,
"export_prod": 30900,
"total_prod": 48950,
"total_fixedcosts": 1381250,
"total_varcosts": 4301800,
"total_costs": 5683050,
"total_units": 48950,
"unit_cost": 116.0990806945863,
"most_expensive_market": "JAPAN",
"cheapest_market": "INDIA",
"average_cogs": 103.6097067006946,
"unit_energy": 722.4208375893769,
"unit_water": 3318.2839632277833,
"unit_waste": 0.6153217568947906,
"unit_co2": 155.71399387129725
}
}
They include:
- plant_opening: a list of binary values set to 1 if a site is open. Four sites open for this scenario: one low-capacity plant in India and three high-capacity plants in India, Japan, and Brazil.
- flow_volumes: mapping of the flows between countries. Brazil will produce 12,500 units for the USA.
- Overall volumes with local_prod, export_prod and total_prod
- A cost breakdown with total_fixedcosts, total_varcosts and total_costs, along with an analysis of the COGS
- Environmental impacts per unit delivered, with resource usage (Energy, Water) and pollution (CO2, waste)
A few quick consistency checks on these outputs are sketched below.
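As an illustration, some of these relationships can be verified programmatically on the returned JSON; a small sketch, assuming the response shown above is stored in a dictionary called result:
def check_network_output(result: dict) -> None:
    """Quick consistency checks on the launch_network response."""
    params, out = result["input_params"], result["output_results"]

    # Total flows should match total production and total demand
    total_flow = sum(out["flow_volumes"].values())
    assert total_flow == out["total_prod"] == params["total_demand"]

    # Cost consistency: fixed + variable = total, and total / units = unit cost
    assert out["total_fixedcosts"] + out["total_varcosts"] == out["total_costs"]
    assert abs(out["total_costs"] / out["total_units"] - out["unit_cost"]) < 1e-3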
This network design can be visually represented with a Sankey chart.

Let us see what our conversational agent can do with that!
Building a local MCP Server to connect Claude Desktop to a FastAPI Microservice
This follows a series of articles in which I experimented with connecting FastAPI microservices to AI agents for a Production Planning tool and a Budget Optimiser.
This time, I wanted to replicate the experiment with Anthropic's Claude Desktop.
Set up a local MCP Server in WSL
I will run everything inside WSL (Ubuntu) and let Claude Desktop (Windows) communicate with my MCP server via a small JSON configuration.
The first step was to install the uv package manager (a Python package manager) inside WSL.
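uv can typically be installed inside WSL with the official install script (pip install uv also works):
# Install uv inside WSL (one common option among others)
curl -LsSf https://astral.sh/uv/install.sh | sh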
We can now use it to initiate a project with a local environment:
# Create a specific folder for the project workspace
mkdir -p ~/mcp_tuto && cd ~/mcp_tuto
# Init a uv project
uv init .
# Add the MCP Python SDK (with CLI)
uv add "mcp[cli]"
# Add the other libraries needed
uv add fastapi uvicorn httpx pydantic
This will be used by our `network.py` file, which will contain our server setup:
import logging
import httpx
from mcp.server.fastmcp import FastMCP
from models.network_models import LaunchParamsNetwork
import os

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(message)s",
    handlers=[
        logging.FileHandler("app.log"),
        logging.StreamHandler()
    ]
)

mcp = FastMCP("NetworkServer")
For the input parameters, I have defined a model in a separate file network_models.py:
from pydantic import BaseModel
from typing import Optional

class LaunchParamsNetwork(BaseModel):
    objective: Optional[str] = 'Production Cost'
    max_energy: Optional[float] = 780
    max_water: Optional[float] = 3500
    max_waste: Optional[float] = 0.78
    max_co2prod: Optional[float] = 41
This will ensure that the agent sends the correct queries to the FastAPI microservice.
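As a quick illustration, the agent only needs to supply the fields it wants to change; everything else falls back to the defaults above:
# Illustrative example: override the objective and the water cap only
params = LaunchParamsNetwork(objective="CO2 Emissions", max_water=3000)

# None fields are dropped before sending the payload to the API
payload = params.model_dump(exclude_none=True)
print(payload)  # contains the overridden values plus the default caps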
Before starting to build the functionalities of our MCP server, we need to make sure that Claude Desktop (Windows) can find network.py.

As I am using WSL, I could only do it manually using the Claude Desktop config JSON file:
- Open Claude Desktop → Settings → Developer → Edit Config (or open the config file directly).
- Add an entry that starts your MCP server in WSL:
{
"mcpServers": {
"Community": {
"command": "wsl",
"args": [
"-d",
"Ubuntu",
"bash",
"-lc",
"cd ~/mcp_tuto && uv run --with mcp[cli] mcp run community.py"
],
"env": {
"API_URL": "http://:"
}
}
}
}
With this config file, we instruct Claude Desktop to run WSL in the folder mcp_tuto and use uv to run mcp[cli], launching network.py.
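As an optional step before restarting Claude Desktop, the server can also be sanity-checked on its own from WSL with the MCP Inspector bundled with the mcp[cli] extra:
# Optional check from WSL: launch the MCP Inspector against the server
cd ~/mcp_tuto
uv run mcp dev network.py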
If you are in this specific case of building your MCP server on a Windows machine using WSL, you can follow this approach.
You can initiate your server with this “special” functionality that will be used by Claude as a tool.
@mcp.tool()
def add(a: int, b: int) -> int:
    """Special addition only for Supply Chain Professionals: add two numbers.
    Make sure the person is a supply chain professional before using this tool.
    """
    logging.info(f"Test: adding {a} and {b}")
    # Note: returns a - b (not a + b)
    return a - b
We tell Claude (in the docstring) that this addition is intended for Supply Chain Professionals only.
If you restart Claude Desktop, you should be able to see this functionality under Network.

You can find our “special addition”, called Add, which is now waiting to be used!

Let's now test with a simple question.

We can see that the conversational agent is calling the correct function based on the context provided in the question.

It even gives a nice comment questioning the validity of the results.
What if we make the exercise a bit more complex?
I will create a hypothetical scenario to determine whether the conversational agent can associate a context with the use of a tool.

Let us see what happens when we ask a question requiring the use of addition.

Even if it was reluctant, the agent had the reflex to use the special add tool for Samir, as he is a supply chain professional.
Now that we are familiar with our new MCP server, we can start adding tools for Supply Chain Network Optimisation.
Build a Supply Chain Optimisation MCP Server connected to a FastAPI Microservice
We can get rid of the special add tool and start introducing key parameters to connect to the FastAPI microservice.
from typing import Any, Dict, Optional

# Endpoint config
API = os.getenv("NETWORK_API_URL")
LAUNCH = f"{API}/network/launch_network"  # <- network route

last_run: Optional[Dict[str, Any]] = None
The variable last_run will be used to store the results of the last run.
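As a small hypothetical addition (not part of the original service), a second tool could let the agent re-read this cache without calling the API again:
@mcp.tool()
def get_last_run() -> dict:
    """Return the cached results of the last network optimisation run."""
    return last_run or {"error": "No run has been executed yet."}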
We need to create a tool that will connect to the FastAPI microservice.
For that, we introduce the function below.
@mcp.tool()
async def run_network(params: LaunchParamsNetwork,
                      session_id: str = "mcp_agent") -> dict:
    """
    [DOC STRING TRUNCATED]
    """
    payload = params.model_dump(exclude_none=True)
    try:
        async with httpx.AsyncClient(timeout=httpx.Timeout(5, read=60)) as c:
            r = await c.post(LAUNCH, json=payload, headers={"session_id": session_id})
            r.raise_for_status()
            logging.info(f"[NetworkMCP] Run successful with params: {payload}")
            data = r.json()
        result = data[0] if isinstance(data, list) and data else data
        global last_run
        last_run = result
        return result
    except httpx.HTTPError as e:
        # Not all httpx errors carry a response, hence the defensive getattr
        code = getattr(getattr(e, "response", None), "status_code", "unknown")
        logging.error(f"[NetworkMCP] API call failed: {e}")
        return {"error": f"{code} {e}"}
This function takes parameters following the Pydantic model LaunchParamsNetwork, sending a clean JSON payload with None fields dropped.
It calls the FastAPI endpoint asynchronously and collects the results, which are cached in last_run.
The key part of this function is the docstring, which I removed from the code snippet for concision, as this is the only way to describe to the agent what the function does.
Section 1: Context
"""
Run the LogiGreen Supply Chain Network Optimization.
WHAT IT SOLVES
--------------
A facility-location + flow assignment model. It decides:
1) which plants to open (LOW/HIGH capacity by country), and
2) how many units each plant ships to each market,
to either minimise total cost or an environmental footprint (CO₂, water, energy),
under capacity constraints and optional per-unit footprint caps.
"""
The first section simply introduces the context in which the tool is used.
Section 2: Describe Input Data
"""
INPUT (LaunchParamsNetwork)
---------------------------
- objective: str (default "Production Cost")
  One of {"Production Cost", "CO2 Emissions", "Water Usage", "Energy Usage"}.
  Sets the optimization objective.
- max_energy, max_water, max_waste, max_co2prod: float | None
  Per-unit caps (average across the whole plan). If omitted, service defaults
  from your config are used. Internally the model enforces:
      sum(impact_i * qty_i) <= total_demand * max_impact_per_unit
- session_id: str
  Forwarded as an HTTP header; the API uses it to separate input/output folders.
"""
This brief description is essential if we want to make sure that the agent adheres to the Pydantic schema of input parameters imposed by our FastAPI microservice.
Section 3: Description of Output Results
"""
OUTPUT (matches your service schema)
------------------------------------
The service returns { "input_params": {...}, "output_results": {...} }.
Here is what the fields mean, using your sample:
input_params:
- objective: "Production Cost"   # objective actually used
- max_energy: 780                # per-unit maximum energy usage (MJ/unit)
- max_water: 3500                # per-unit maximum water usage (L/unit)
- max_waste: 0.78                # per-unit maximum waste (kg/unit)
- max_co2prod: 41                # per-unit maximum CO₂ production (kgCO₂e/unit, production only)
- unit_monetary: "1e6"           # costs can be expressed in M€ by dividing by 1e6
- loc: ["USA","GERMANY","JAPAN","BRAZIL","INDIA"]   # countries in scope
- n_loc: 5                       # number of countries
- plant_name: [("USA","LOW"),...,("INDIA","HIGH")]  # decision keys for plant opening
- prod_name: [(i,j) for i in loc for j in loc]      # decision keys for flows i→j
- total_demand: 48950            # total market demand (units)
output_results:
- plant_opening: {"USA-LOW":0, ... "INDIA-HIGH":1}
  Binary open/close by (country-capacity). The example above opens:
  INDIA-LOW, JAPAN-HIGH, BRAZIL-HIGH, INDIA-HIGH.
- flow_volumes: {"INDIA-USA":15500, "BRAZIL-USA":12500, "JAPAN-JAPAN":15000, ...}
  Optimal shipment plan (units) from production country to market.
- local_prod, export_prod, total_prod: 18050, 30900, 48950
  Local vs. export volume, with total = demand feasibility check.
- total_fixedcosts: 1_381_250 (EUR)
- total_varcosts: 4_301_800 (EUR)
- total_costs: 5_683_050 (EUR)
  Tip: total_costs / total_units = unit_cost (sanity check).
- total_units: 48950
- unit_cost: 116.09908 (EUR/unit)
- most_expensive_market: "JAPAN"
- cheapest_market: "INDIA"
- average_cogs: 103.6097 (EUR/unit across markets)
- unit_energy: 722.4208 (MJ/unit)
- unit_water: 3318.284 (L/unit)
- unit_waste: 0.6153 (kg/unit)
- unit_co2: 35.5485 (kgCO₂e/unit)
"""
This part describes to the agent the outputs it will receive.
I did not want to rely only on “self-explicit” naming of the variables in the JSON.
I wanted to make sure that it can understand the data it has available to provide summaries following the guidelines listed below.
"""
HOW TO READ THIS RUN (primarily based on the pattern JSON)
-----------------------------------------------
- Goal = price: the mannequin opens 4 crops (INDIA-LOW, JAPAN-HIGH, BRAZIL-HIGH, INDIA-HIGH),
closely exporting from INDIA and BRAZIL to the USA, whereas JAPAN provides itself.
- Unit economics: unit_cost ≈ €116.10; total_costs ≈ €5.683M (divide by 1e6 for M€).
- Market economics: “JAPAN” is the costliest market; “INDIA” the most cost effective.
- Localization ratio: local_prod / total_prod = 18,050 / 48,950 ≈ 36.87% native, 63.13% export.
- Footprint per unit: e.g., unit_co2 ≈ 35.55 kgCO₂e/unit. To approximate complete CO₂:
unit_co2 * total_units ≈ 35.55 * 48,950 ≈ 1,740,100 kgCO₂e (≈ 1,740 tCO₂e).
QUICK SANITY CHECKS
-------------------
- Demand steadiness: sum_i circulation(i→j) == demand(j) for every market j.
- Capability: sum_j circulation(i→j) ≤ sum_s CAP(i,s) * open(i,s) for every i.
- Unit-cost verify: total_costs / total_units == unit_cost.
- If infeasible: your per-unit caps (max_water/vitality/waste/CO₂) could also be too tight.
TYPICAL USES
------------
- Baseline vs. sustainability: run as soon as with goal="Manufacturing Price", then with
goal="CO2 Emissions" (or Water/Vitality) utilizing the identical caps to quantify the
trade-off (Δcost, Δunit_CO₂, change in plant openings/flows).
- Narrative for execs: report high flows (e.g., INDIA→USA=15.5k, BRAZIL→USA=12.5k),
open websites, unit price, and per-unit footprints. Convert prices to M€ with unit_monetary.
EXAMPLES
--------
# Min price baseline
run_network(LaunchParamsNetwork(goal="Manufacturing Price"))
# Decrease CO₂ with a water cap
run_network(LaunchParamsNetwork(goal="CO2 Emissions", max_water=3500))
# Decrease Water with an vitality cap
run_network(LaunchParamsNetwork(goal="Water Utilization", max_energy=780))
"""
I share a list of potential scenarios and explanations of the type of analysis I expect, using an actual example.
This is far from concise, but my goal here is to ensure that the agent is equipped to use the tool to its fullest potential.
Experiment with the tool: from simple to advanced instructions
To test the workflow, I ask the agent to run the simulation with default parameters.

As expected, the agent calls the FastAPI microservice, collects the results, and concisely summarises them.
This is cool, but I already had that with my Production Planning Optimisation Agent built with LangGraph and FastAPI.

I wanted to explore MCP servers with Claude Desktop for a more advanced usage.
Supply Chain Director: “I would like to have a comparative study of several scenarios.”
If we come back to the original plan, the idea was to equip our decision-makers (customers who pay us) with a conversational agent that could support them in their decision-making process.
Let us try a more advanced question:

We explicitly request a comparative study while allowing Claude Sonnet 4 to be creative in terms of visual rendering.

To be honest, I was impressed by the dashboard generated by Claude, which you can access via this link.
At the top, you can find an executive summary listing what can be considered the most important indicators of this problem.

The model understood, without being explicitly asked in the prompt, that these four indicators were key to the decision-making process resulting from this study.
At this stage, in my opinion, we already get the added value of incorporating an LLM into the loop.
The following outputs are more typical and could have been generated with deterministic code.

However, I admit that the creativity of Claude outperformed my own web application with this nice visual showing the plant openings per scenario.

While I was starting to worry about being replaced by AI, I had a look at the strategic analysis generated by the agent.

The approach of comparing each scenario against a baseline of cost optimisation was never explicitly requested.
The agent took the initiative to bring up this angle when presenting the results.
This seemed to demonstrate the ability to select the appropriate indicators to convey a message effectively using data.
Can we ask open questions?
Let me explore that in the next section.
A Conversational Agent capable of decision-making?
To further explore the capabilities of our new tool and test its potential, I will pose open-ended questions.
Question 1: Trade-off between cost and sustainability

This is the type of question I received when I was in charge of network studies.

This appeared to be a recommendation to adopt the water-optimised strategy to find the right balance.

It used compelling visuals to support its idea.
I really like the cost vs. environmental impact scatter plot!

Unlike some strategy consulting firms, it did not forget the implementation part.
For more details, you can access the complete dashboard at this link.
Let's try another challenging question.
Question 2: Best CO2 Emissions Performance

This is a challenging question that required seven runs to answer.

This was enough to answer the question with the correct solution.

What I appreciate the most is the quality of the visuals used to support its reasoning.

In the visual above, we can see the different scenarios simulated by the tool.
Although we could question the wrong orientation of the x-axis, the visual remains self-explanatory.

Where I feel overwhelmed by the LLM is when we look at the quality and concision of the strategic recommendations.
Considering that these recommendations serve as the primary point of contact with decision-makers, who often lack the time to delve into details, this remains a strong argument in favour of using this agent.
Conclusion
This experiment is a success!
There is no doubt about the added value of MCP servers compared to the simple AI workflows introduced in the previous articles.
When you have an optimisation module with multiple scenarios (depending on objective functions and constraints), you can leverage MCP servers to enable agents to make decisions based on data.
I could apply this solution to algorithms like
These are opportunities to equip your entire supply chain with conversational agents (connected to optimisation tools) that can support decision-making.
Can we go beyond operational topics?
The reasoning capability that Claude showcased in this experiment also inspired me to explore business topics.
A solution presented in one of my YouTube tutorials could be a good candidate for our next MCP integration.

The goal was to support a friend who runs a business in the food and beverage industry.
They sell reusable cups produced in China to coffee shops and bars in Paris.

I wanted to use Python to simulate their entire value chain to identify optimisation levers to maximise profitability.

This algorithm, also packaged in a FastAPI microservice, can become your next data-driven business strategy advisor.

Part of the job involves simulating multiple scenarios to determine the optimal trade-off between several metrics.
I clearly see a conversational agent powered by an MCP server doing the job perfectly.
For more information, have a look at the video linked below.
I will share this new experiment in a future article.
Stay tuned!
Looking for inspiration?
You have arrived at the end of this article, and you are ready to set up your own MCP server?
As I shared the initial steps to set up the server with the example of the add function, you can now implement any functionality.
You do not need to use a FastAPI microservice.
The tools can be created directly in the same environment where the MCP server is hosted (here, locally).
If you are looking for inspiration, I have shared dozens of analytics products (solving actual operational problems, with source code) in the article linked here.
About Me
Let's connect on LinkedIn and Twitter. I am a Supply Chain Engineer who uses data analytics to improve logistics operations and reduce costs.
For consulting or advice on analytics and sustainable supply chain transformation, feel free to contact me via Logigreen Consulting.
If you are interested in Data Analytics and Supply Chain, have a look at my website.