Posts

From Snap to Story: Building an AI Photo Journal with React Native, PoML & MCP: Part 3

Part 3: Set up image picking (camera/gallery) and sending the photo to the MCP server. In this post, we will add a feature to our React Native app that lets the user take a photo (or pick one from the gallery), send it to the backend, and display the returned AI caption (for now still a placeholder). This is where the “Photo Journal” starts coming alive.
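The flow the excerpt describes (pick a photo, post it to the server, show the caption) can be sketched as a small helper that derives the upload request. This is a minimal sketch; the endpoint path (`/caption`), the field layout, and the JPEG mime type are assumptions for illustration, not details confirmed by the post.

```typescript
// Hypothetical sketch: derive the upload request for the caption endpoint.
// The "/caption" path and the file-descriptor shape are assumptions.
function buildCaptionRequest(photoUri: string, serverUrl: string) {
  // React Native multipart uploads describe a file as { uri, name, type }.
  const file = {
    uri: photoUri,
    name: photoUri.split("/").pop() ?? "photo.jpg",
    type: "image/jpeg",
  };
  return { url: `${serverUrl}/caption`, method: "POST" as const, file };
}
```

In the app, the returned descriptor would be appended to a `FormData`, sent with `fetch`, and the caption from the response rendered into component state.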

From Snap to Story: Building an AI Photo Journal with React Native, PoML & MCP: Part 8

Part 8: Add PoML Schema for Your Image Captioning. In the previous parts of this series, we built a simple React Native app that lets us select an image from the camera or the user’s phone gallery and send it to the MCP server endpoint we built using Node/Express, which queries OpenAI to get a caption for the image.

From Snap to Story: Building an AI Photo Journal with React Native, PoML & MCP: Part 7

In the previous parts of this series, we built a simple React Native app that lets us select an image from the camera or the user’s phone gallery and send it to the MCP server endpoint we built using Node/Express, which queries OpenAI to get a caption for the image.

From Snap to Story: Building an AI Photo Journal with React Native, PoML & MCP: Part 6

In the previous parts of this series, we built a simple React Native app that lets us select an image from the camera or the user’s phone gallery and send it to the MCP server endpoint we built using Node/Express, which queries OpenAI to get a caption for the image.

From Snap to Story: Building an AI Photo Journal with React Native, PoML & MCP: Part 5

Part 5: Add PoML Schema for Your Image Captioning. In the previous parts of this series, we built a simple React Native app that lets us select an image from the camera or the user’s phone gallery and send it to the MCP server endpoint we built using Node/Express, which queries OpenAI to get a caption for the image.
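The server-side step this excerpt summarizes (the Express endpoint querying OpenAI for a caption) can be sketched as a function that builds the Chat Completions payload for an image. The model name and prompt wording below are assumptions, not choices confirmed by the series; the multimodal message shape follows the OpenAI Chat Completions API.

```typescript
// Hypothetical sketch: build the OpenAI Chat Completions payload the
// Express endpoint might send. Model and prompt text are assumptions.
type ContentPart = { type: string; text?: string; image_url?: { url: string } };

function buildCaptionPayload(imageBase64: string) {
  const content: ContentPart[] = [
    { type: "text", text: "Write a short caption for this photo." },
    // Images can be passed inline as a base64 data URL.
    { type: "image_url", image_url: { url: `data:image/jpeg;base64,${imageBase64}` } },
  ];
  return { model: "gpt-4o-mini", messages: [{ role: "user" as const, content }] };
}
```

The endpoint would pass this payload to the OpenAI SDK and return the model’s text as the caption.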

From Snap to Story: Building an AI Photo Journal with React Native, PoML & MCP: Part 4

Part 4: Add PoML Schema for Your Image Captioning. By Bharat Tiwari.

Quick Context: What Makes a “Real MCP”. A real MCP (Model Context Protocol) app differs from a simple API because it:

- Standardizes how models are described, invoked, and reasoned with, using PoML or structured schemas.
- Allows composable “capabilities” (captioning, analysis, classification, etc.) that can be dynamically discovered or linked to other MCP nodes.
- Supports multi-model pipelines, chaining different AI tasks in context.
- Can expose self-describing interfaces, meaning another agent or tool can auto-discover what the MCP can do (using /capabilities, /describe, etc.).
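The self-describing interface mentioned above can be sketched as a capabilities document that a route like GET /capabilities would return. The field names and the single `caption` entry below are assumptions for illustration; the post does not specify the schema.

```typescript
// Hypothetical sketch of a self-describing capabilities document.
// Field names and entries are assumptions, not the post's actual schema.
type Capability = {
  name: string;        // capability identifier other agents can discover
  route: string;       // endpoint to invoke it
  description: string; // human/agent-readable summary
};

function describeCapabilities(): Capability[] {
  return [
    {
      name: "caption",
      route: "/caption",
      description: "Return an AI-generated caption for an uploaded image.",
    },
  ];
}

// In the Express server this could back the discovery endpoint:
// app.get("/capabilities", (_req, res) => res.json(describeCapabilities()));
```

Another MCP node could fetch this document and learn, without prior coupling, which routes to call.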
