{"id":5334,"date":"2026-04-21T15:45:58","date_gmt":"2026-04-21T14:45:58","guid":{"rendered":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/"},"modified":"2026-04-21T15:46:02","modified_gmt":"2026-04-21T14:46:02","slug":"i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i","status":"publish","type":"post","link":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/","title":{"rendered":"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026"},"content":{"rendered":"<div class=\"anp-pro-entry\">\n<p class=\"anp-pro-p\">We, at XDA, absolutely love local LLMs. That &#8220;we&#8221; didn&#8217;t really include me for the longest time because I was perfectly happy letting cloud-based models do all the heavy lifting. Why wrestle with quantized weights and a fiddly setup when the results would always feel like a downgrade from what a cloud model hands you for free? So, after trying out a local LLM and being disappointed, I let the first impression be my last for a long, long time.<\/p>\n<p class=\"anp-pro-p\">However, seeing how much the other folks at XDA rave about local LLMs made me feel like I was missing something. So, I finally gave one another shot. This time, instead of running a model on my MacBook Air with its embarrassing 8GB of RAM, I decided to run one straight on my phone. 
It was far more useful than I had any right to expect.<\/p>\n<figure class=\"anp-pro-inline-figure\" style=\"margin:1.75em auto;text-align:center;max-width:100%\"><img decoding=\"async\" class=\"anp-pro-inline-img\" src=\"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/screenshot-2026-04-06-at-2-41-12-am-1.png\" alt=\"\" style=\"margin:0 auto;max-width:100%;width:auto;height:auto;object-fit:contain;object-position:center\" loading=\"lazy\"><\/figure>\n<p class=\"anp-pro-p\">The reason local LLMs haven&#8217;t really appealed to me is largely my own fault: I&#8217;ve been trying to run them on hardware they aren&#8217;t really designed for. The way an LLM works is that when you ask ChatGPT something, your prompt gets sent to a massive model sitting on powerful servers somewhere in a data center.<\/p>\n<p class=\"anp-pro-p\">With a local LLM, the entire model has to fit on your device instead. This includes all of the model&#8217;s trained weights (essentially files containing everything the model has learned) and parameters, crammed into a file small enough for your device&#8217;s memory to handle. Historically, the trade-off has always been quality and speed. However, AI companies have been working overtime to address exactly this, and Google&#8217;s Gemma 4 is one of the best examples of that effort paying off.<\/p>\n<p class=\"anp-pro-p\">Gemma 4 is Google&#8217;s newest family of open-source AI models built on the same architecture as Gemini 3, and it consists of models in four different sizes: E2B and E4B for phones and edge devices, a 26B mixture-of-experts model, and a full 31B dense model. The best part about these models is that Google has intentionally engineered them to squeeze more intelligence out of each parameter. LLM parameters are essentially the settings that control and optimize a model&#8217;s output and behavior.<\/p>\n<p class=\"anp-pro-p\">Traditionally, more parameters translate to better results. 
However, that also means you need more hardware to run them. Gemma 4 flips this by getting smarter output from fewer parameters! In simpler terms, you&#8217;re getting responses that feel like they&#8217;re coming from a larger model without needing the hardware to run one.<\/p>\n<p class=\"anp-pro-p\">Two of the models from the Gemma 4 family, E2B and E4B, are optimized to run smoothly on devices like your phone or laptop. Given that local models run on your own hardware, they&#8217;re completely free to use, and your data stays entirely on your device. So, if you have a relatively modern phone, you&#8217;ve got no reason not to try it. You have a few options to run Gemma 4 on your phone.<\/p>\n<figure class=\"anp-pro-inline-figure\" style=\"margin:1.75em auto;text-align:center;max-width:100%\"><img decoding=\"async\" class=\"anp-pro-inline-img\" src=\"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/google-ai-edge-gallery-mobile-interface-showing-gemma-4-ai-models-and-use-cases.png\" alt=\"\" style=\"margin:0 auto;max-width:100%;width:auto;height:auto;object-fit:contain;object-position:center\" loading=\"lazy\"><\/figure>\n<p class=\"anp-pro-p\">First up, you can use Google&#8217;s AI Edge Gallery app, which is completely free to download and available on both iOS and Android. You can also use Locally AI on iPhone, iPad, and Mac. That app lets you run Llama, DeepSeek, Qwen, and Gemma models locally, and is optimized for Apple Silicon. I&#8217;ve personally been using the Google AI Edge Gallery app, and it&#8217;s been pretty smooth.<\/p>\n<p class=\"anp-pro-p\">Regardless of which route you go, the setup is truly as simple as it gets \u2014 you download the model you&#8217;d like to use offline (in this case, Gemma-4-E2B-it or Gemma-4-E4B-it) and that&#8217;s about it. The former requires 2.5GB to install, whereas the latter requires 3.61GB. 
I&#8217;ve been using the Gemma-4-E2B-it model on my iPhone 15 Pro Max, and well, I&#8217;m genuinely impressed by how much it can handle.<\/p>\n<p class=\"anp-pro-p\">I might be a writer for publications that are always at the whims of Google, but I&#8217;m not going to lie to you and say I still Google every little question that pops into my head. Like most people, I&#8217;ve been turning to AI instead. The accuracy of the responses is another story, but it isn&#8217;t a bad way to get a quick answer by any means. And like a lot of people, a significant chunk of the tasks I use AI for is pretty surface-level. Think quick text cleanups, drafting an email, or explaining something I don&#8217;t quite get. I&#8217;m also a student, and AI certainly has a massive role in my learning process nowadays. I&#8217;m always asking it to break down complex concepts, quiz me on topics, walk me through problems step by step, or explain something my professor glossed over in five seconds (usually while I&#8217;m sitting in said professor&#8217;s class).<\/p>\n<p class=\"anp-pro-p\">None of these tasks need flagship AI models like Opus 4.7 and GPT 5.4. More importantly, none of them need an internet connection or a server thousands of miles away to process your request. Gemma 4 is excellent at all of these tasks, and it&#8217;s what I&#8217;ve been using exclusively for them since I installed it. Gemma 4 also isn&#8217;t limited to just chat. Through the AI Edge Gallery app, you can use Ask Image to identify objects or read text from photos, Audio Scribe for quick offline voice transcriptions, Agent Skills that let the model use tools like Wikipedia and maps, and even a Prompt Lab for fine-tuning how the model responds. And again, all of the above runs entirely on your phone!<\/p>\n<p class=\"anp-pro-p\">Now, I&#8217;m in no way, shape, or form saying that Gemma 4 is going to replace ChatGPT, Claude, or Gemini for everything. It won&#8217;t. 
But also, it doesn&#8217;t really need to! It just has to handle the lightweight everyday stuff, and it does that more than well enough. Turns out, a 2.5GB model on your phone with no internet is a lot more useful than you&#8217;d think.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>The topic I ran a full LLM on my phone with no internet, and it&#8217;s more &hellip; <a title=\"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026\" class=\"hm-read-more\" href=\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/\"><span class=\"screen-reader-text\">I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026<\/span>Read more<\/a><\/p>\n","protected":false},"author":0,"featured_media":5335,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[1211,239,245,816,365],"class_list":["post-5334","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-innovate","tag-gemma","tag-google","tag-model","tag-models","tag-phone"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - 
https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026 - innovatenews.site<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026 - innovatenews.site\" \/>\n<meta property=\"og:description\" content=\"The topic I ran a full LLM on my phone with no internet, and it&#8217;s more &hellip; I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026Read more\" \/>\n<meta property=\"og:url\" content=\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/\" \/>\n<meta property=\"og:site_name\" content=\"innovatenews.site\" \/>\n<meta property=\"article:published_time\" content=\"2026-04-21T14:45:58+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-04-21T14:46:02+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1600\" \/>\n\t<meta property=\"og:image:height\" content=\"900\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/\"},\"author\":{\"name\":\"\",\"@id\":\"\"},\"headline\":\"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026\",\"datePublished\":\"2026-04-21T14:45:58+00:00\",\"dateModified\":\"2026-04-21T14:46:02+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/\"},\"wordCount\":1147,\"image\":{\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg\",\"keywords\":[\"Gemma\",\"Google\",\"Model\",\"Models\",\"Phone\"],\"articleSection\":[\"Innovate\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/\",\"url\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/\",\"name\":\"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026 - 
innovatenews.site\",\"isPartOf\":{\"@id\":\"https:\/\/innovatenews.site\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg\",\"datePublished\":\"2026-04-21T14:45:58+00:00\",\"dateModified\":\"2026-04-21T14:46:02+00:00\",\"author\":{\"@id\":\"\"},\"breadcrumb\":{\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#primaryimage\",\"url\":\"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg\",\"contentUrl\":\"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg\",\"width\":1600,\"height\":900},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/innovatenews.site\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"I ran a full LLM on my phone with no internet, and it&#039;s more useful than 
I\u2026\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/innovatenews.site\/#website\",\"url\":\"https:\/\/innovatenews.site\/\",\"name\":\"innovatenews.site\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/innovatenews.site\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026 - innovatenews.site","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/","og_locale":"en_US","og_type":"article","og_title":"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026 - innovatenews.site","og_description":"The topic I ran a full LLM on my phone with no internet, and it&#8217;s more &hellip; I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026Read more","og_url":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/","og_site_name":"innovatenews.site","article_published_time":"2026-04-21T14:45:58+00:00","article_modified_time":"2026-04-21T14:46:02+00:00","og_image":[{"width":1600,"height":900,"url":"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg","type":"image\/jpeg"}],"twitter_card":"summary_large_image","twitter_misc":{"Est. 
reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#article","isPartOf":{"@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/"},"author":{"name":"","@id":""},"headline":"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026","datePublished":"2026-04-21T14:45:58+00:00","dateModified":"2026-04-21T14:46:02+00:00","mainEntityOfPage":{"@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/"},"wordCount":1147,"image":{"@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#primaryimage"},"thumbnailUrl":"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg","keywords":["Gemma","Google","Model","Models","Phone"],"articleSection":["Innovate"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/","url":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/","name":"I ran a full LLM on my phone with no internet, and it&#039;s more useful than I\u2026 - 
innovatenews.site","isPartOf":{"@id":"https:\/\/innovatenews.site\/#website"},"primaryImageOfPage":{"@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#primaryimage"},"image":{"@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#primaryimage"},"thumbnailUrl":"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg","datePublished":"2026-04-21T14:45:58+00:00","dateModified":"2026-04-21T14:46:02+00:00","author":{"@id":""},"breadcrumb":{"@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#primaryimage","url":"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg","contentUrl":"https:\/\/innovatenews.site\/wp-content\/uploads\/2026\/04\/gemma-4-feature-image.jpg","width":1600,"height":900},{"@type":"BreadcrumbList","@id":"https:\/\/innovatenews.site\/index.php\/2026\/04\/21\/i-ran-a-full-llm-on-my-phone-with-no-internet-and-its-more-useful-than-i\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/innovatenews.site\/"},{"@type":"ListItem","position":2,"name":"I ran a full LLM on my phone with no internet, and it&#039;s more useful than 
I\u2026"}]},{"@type":"WebSite","@id":"https:\/\/innovatenews.site\/#website","url":"https:\/\/innovatenews.site\/","name":"innovatenews.site","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/innovatenews.site\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"}]}},"_links":{"self":[{"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/posts\/5334","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/comments?post=5334"}],"version-history":[{"count":1,"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/posts\/5334\/revisions"}],"predecessor-version":[{"id":5341,"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/posts\/5334\/revisions\/5341"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/media\/5335"}],"wp:attachment":[{"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/media?parent=5334"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/categories?post=5334"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/innovatenews.site\/index.php\/wp-json\/wp\/v2\/tags?post=5334"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}