Model Context Protocol (MCP) Flashcards
A bite‑size question‑and‑answer review of the Model Context Protocol.
1. Who open‑sourced the Model Context Protocol (MCP) and when?
Anthropic released MCP as an open, vendor‑neutral standard in late 2024.
2. What pain point does MCP primarily address?
It eliminates the need for bespoke connectors between every AI model and every external data source, providing a universal, plug‑and‑play interface.
3. MCP is often compared to which everyday hardware standard?
USB‑C — a single, universal port that connects many devices.
4. Name the three core roles in MCP’s architecture.
• Host – the AI application/platform
• Client – an in‑host adapter that manages one server connection
• Server – a lightweight connector exposing data or tools (a minimal server sketch follows this card)
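To ground the three roles, here is a minimal server‑side sketch. It assumes the official `mcp` Python SDK and its FastMCP helper; the exact API may differ by version, and the connector name and contents are illustrative.

```python
# Minimal MCP server sketch (assumes the official `mcp` Python SDK's FastMCP helper).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-connector")  # the Server role: a lightweight connector

@mcp.tool()
def add(a: int, b: int) -> int:
    """A Tool the Host's model can invoke: add two numbers."""
    return a + b

@mcp.resource("note://welcome")
def welcome_note() -> str:
    """A read-only Resource the model can pull in as context."""
    return "Welcome to the demo connector."

if __name__ == "__main__":
    # A Host spawns this process; its Client talks to it over stdio.
    mcp.run()
```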
5. Which side advertises its capabilities during the handshake?
Both sides exchange capabilities during initialize, but the Server is the one advertising what it offers (resources, tools, prompts); the Client declares client‑side capabilities such as sampling.
6. What transport layers does MCP officially support today?
Local stdio streams and remote HTTP/SSE; WebSocket support is on the roadmap.
7. Why did MCP pick JSON‑RPC 2.0 as its message format?
JSON‑RPC is simple, language‑agnostic, and already well‑supported, enabling structured request/response plus notifications.
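For a feel of the wire format, here is the shape of JSON‑RPC 2.0 traffic over an MCP transport, sketched as Python dicts using the protocol's built‑in ping method; treat the exact payloads as illustrative.

```python
# Request/response pairs share an "id"; notifications omit it and get no reply.
request = {"jsonrpc": "2.0", "id": 1, "method": "ping"}
response = {"jsonrpc": "2.0", "id": 1, "result": {}}  # id matches the request

notification = {"jsonrpc": "2.0", "method": "notifications/initialized"}
```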
8. What three‑step sequence establishes an MCP session?
Client sends initialize → server responds with its chosen protocol version and capabilities → client sends the initialized notification.
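Sketched as JSON‑RPC payloads (protocol version string and capability fields abbreviated; the exact shape depends on the spec revision):

```python
# 1. Client -> Server: propose a protocol version and announce client capabilities.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-host", "version": "0.1.0"},
    },
}

# 2. Server -> Client: confirm a version and advertise what it offers.
initialize_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "protocolVersion": "2024-11-05",
        "capabilities": {"resources": {}, "tools": {}},
        "serverInfo": {"name": "example-connector", "version": "0.1.0"},
    },
}

# 3. Client -> Server: signal that normal traffic can begin.
initialized_notification = {"jsonrpc": "2.0", "method": "notifications/initialized"}
```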
9. Which capability type lets the AI *read* external context?
Resources – read‑only files, database rows, documents, etc.
10. Which capability type empowers the AI to *act* on external systems?
Tools – function‑like actions the model can invoke.
11. How do ‘Prompts’ differ from ‘Tools’ in MCP?
Prompts are reusable template workflows (often user‑triggered), whereas Tools are direct action calls initiated by the model.
12. What does the Sampling capability allow an MCP server to do?
Request an LLM completion from the host, effectively letting the server ‘ask’ the model for help.
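A sampling request flows in the reverse direction (server to client). A rough sketch of the payload, assuming the spec's sampling/createMessage method; the prompt text is illustrative:

```python
# Server -> Client: ask the host's LLM for a completion on the server's behalf.
sampling_request = {
    "jsonrpc": "2.0",
    "id": 42,
    "method": "sampling/createMessage",
    "params": {
        "messages": [
            {"role": "user", "content": {"type": "text", "text": "Summarize this changelog."}}
        ],
        "maxTokens": 200,
    },
}
# The host typically surfaces this to the user for approval before running the model.
```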
13. Give an example of a Resource URI an MCP server might expose.
drive://reports/q1.pdf from a Google Drive connector.
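Reading that (hypothetical) resource would look roughly like this; field names follow the public spec, and the drive:// connector is an illustration, not an official one:

```python
# Client -> Server: fetch the contents of a resource by URI.
read_request = {
    "jsonrpc": "2.0",
    "id": 5,
    "method": "resources/read",
    "params": {"uri": "drive://reports/q1.pdf"},
}

# Server -> Client: contents carry a MIME type so the host knows how to handle them.
read_response = {
    "jsonrpc": "2.0",
    "id": 5,
    "result": {
        "contents": [
            {"uri": "drive://reports/q1.pdf", "mimeType": "application/pdf", "blob": "<base64-bytes>"}
        ]
    },
}
```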
14. Give an example of a Tool an MCP Slack server could provide.
post_message to send a Slack notification.
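A hypothetical post_message tool might be described and invoked like this; the tool name, parameters, and channel are illustrative, not from any official Slack connector:

```python
# How the server might describe the tool in its tools/list response.
post_message_tool = {
    "name": "post_message",
    "description": "Post a message to a Slack channel.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "channel": {"type": "string"},
            "text": {"type": "string"},
        },
        "required": ["channel", "text"],
    },
}

# How the host's client invokes it once the model decides to call it.
call_request = {
    "jsonrpc": "2.0",
    "id": 9,
    "method": "tools/call",
    "params": {"name": "post_message", "arguments": {"channel": "#general", "text": "Deploy finished"}},
}
```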
15. How does MCP support low‑latency partial results for long tasks?
Servers can stream chunks through HTTP Server‑Sent Events (SSE) before sending the final result.
16. What security framework underpins MCP authentication?
OAuth 2.0 bearer tokens with fine‑grained scopes.
17. Why does MCP require explicit user consent for most tool calls?
To keep humans in the loop, preventing unintended or destructive actions by the AI.
18. List two common threat vectors MCP’s security model mitigates.
• Token misuse / confused‑deputy attacks via audience‑bound tokens
• Prompt or tool injection through sandboxing, scope limits, and user approval
19. Contrast MCP with ChatGPT plug‑ins in one sentence.
Plug‑ins are proprietary to OpenAI and focus on API calls, while MCP is an open, model‑agnostic standard covering data, tools, prompts, and more.
20. How does MCP complement (rather than compete with) LangChain?
LangChain defines in‑process Python/TS tools; MCP standardizes how any agent discovers and calls such tools across process or network boundaries.
21. Which Google protocol focuses on agent‑to‑agent coordination, not agent‑to‑tool, and therefore pairs well with MCP?
Google’s Agent‑to‑Agent (A2A) protocol.
22. First two steps a Host takes to make a new MCP tool callable by its LLM?
• Call tools/list to fetch each tool's name, description, and JSON input schema.
• Register those schemas as callable functions in the model's context (see the sketch below).
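A minimal sketch of those two steps, assuming an OpenAI‑style function‑calling format on the model side; `send_request` stands in for whatever JSON‑RPC plumbing the host already has, and all names here are illustrative:

```python
def register_mcp_tools(send_request, model_functions: list) -> None:
    # 1. Ask the server what tools it offers (name, description, JSON input schema).
    result = send_request({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
    for tool in result["result"]["tools"]:
        # 2. Re-expose each tool as a callable function in the model's context,
        #    here translated into an OpenAI-style function definition as an example target.
        model_functions.append({
            "type": "function",
            "function": {
                "name": tool["name"],
                "description": tool.get("description", ""),
                "parameters": tool["inputSchema"],
            },
        })
```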
23. When should a server send progress notifications instead of blocking?
For long‑running operations (e.g., bulk database export) where immediate feedback or status updates are helpful.
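Progress updates travel as notifications tied to a progress token supplied with the original request. A rough sketch, where export_database is a hypothetical long‑running tool:

```python
# Client includes a progress token in the request's metadata...
export_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "export_database",
        "arguments": {"table": "orders"},
        "_meta": {"progressToken": "export-1"},
    },
}

# ...and the server streams notifications referencing that token while it works.
progress_update = {
    "jsonrpc": "2.0",
    "method": "notifications/progress",
    "params": {"progressToken": "export-1", "progress": 500, "total": 2000},
}
```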
24. Why is robust logging essential in MCP deployments?
For auditing tool usage, debugging model behavior, and tracking security‑relevant events across client and server.
25. How do Client and Server agree on protocol evolution without breakage?
Date‑stamped version negotiation during initialize; they pick the highest mutually supported spec.
26. Local vs. remote MCP server: give one advantage of each.
• Local (stdio) – keeps sensitive data on‑device, ultra‑low latency.
• Remote (HTTP) – scales out, shares one connector with many agents.
27. Name three high‑profile companies that announced MCP support by 2025.
OpenAI, Google, Microsoft (Azure).
28. What are two early flagship MCP connectors many demos use?
GitHub (code) and Google Drive (files).
29. Biggest UX challenge highlighted for MCP going forward?
Designing granular yet user‑friendly permission and consent flows.
30. Which upcoming transport feature is slated to improve true bidirectional streaming?
Native WebSocket support.
31. Planned enhancement to better handle images, audio, and video?
First‑class multimodal content types and chunked binary transfer.
32. What governance effort is forming to avoid protocol fragmentation?
A neutral MCP working group / standards foundation to steward spec changes.
33. Approximate latency for a local **resources/list** call on a small set?
A few milliseconds (often < 10 ms).
34. Typical latency for a remote **tools/call** that hits an external SaaS API?
Roughly 50–200 ms for quick calls, longer if the API itself is slow.
35. Describe the defense‑in‑depth idea behind MCP’s sandbox model.
Each server is isolated with its own auth scope and directory/API limits, so compromise of one connector doesn’t grant broad system access.
36. How does streaming improve user experience during large resource reads?
Partial content arrives quickly, letting the AI or user start reading while the rest downloads.
37. Key roadmap item to standardize common schemas and aid discovery?
A public MCP Registry with typed schema definitions and connector metadata.
38. Why can adding dozens of MCP servers bloat an LLM’s prompt?
Each server’s tool/function description consumes tokens; too many simultaneously can exceed context length or confuse the model.
39. Operational best practice when upgrading MCP libraries?
Run integration tests to catch version or schema mismatches before deployment.
40. What is a simple analogy summarizing MCP’s value proposition in one line?
MCP is the ‘universal API socket’ that lets any AI model plug into any data or tool securely and consistently.