
Unleashing Creativity: A Guide to Auto-Creating Awesome Instagram Posts with Gen-AI

Automated Instagram Posts

These days, having a web app and a blog is no longer enough to drive people to your website. You need a sound social media strategy. I’m not a big fan of Instagram, but I do see its value as a lead-gen source. Lazy as I am, I didn’t feel like posting something on Instagram every day, so I set out to build a side app that automated what I would otherwise post by hand. This post details what was built and how; the code is available on GitHub.

The idea: scraping news and auto-publishing an Instagram post from it

That’s it. I figured that if I could write some code to search for news topics (in our case, news around lead gen), I could:

  • Write a synopsis
  • Create a compelling image to fit the text
  • Add the headline to the text
  • Publish it on Instagram with tags etc
  • Schedule it to run every four hours

Easy! All I needed was some APIs (Instagram, OpenAI, Google) and a few lines of Python code.

A month later...

Yep, a month of generating and mostly debugging Python code with ChatGPT, figuring out Meta’s APIs, and coming up with all kinds of workarounds, and it’s ready: v1 of Scrape and Bake. It essentially does what I envisioned, albeit with some extra steps.

Scrape and Bake: the steps to auto-generate Instagram posts from scraped news headlines

Step 1: Searching Google

This part was easy and did not take long. I used the Selenium web driver and simply called Google with a search term. More work was what came after: storing each headline and URL as a pair and having ChatGPT pick the best headline. Extra workarounds were needed to ensure no headline was picked twice. In the end I had to store the selected headlines in an archive file (later I’ll probably move it to a database) so they can be ignored the next round.

import requests

def pick_best_headline(remaining_headlines):
    # Build the prompt from the stored headline/URL pairs
    message_input = (
        "Pick the most compelling headline from the following lines. "
        "Do not return anything (no added text or quotes) else than "
        "the actual best headline:\n\n"
    )
    for stored_headline in remaining_headlines:
        message_input += f"{stored_headline['headline']}\n"

    # Replace 'sk-XXX' with your actual OpenAI API key
    api_key = 'sk-XXX'
    api_url = 'https://api.openai.com/v1/chat/completions'

    headers = {
        'Content-Type': 'application/json',
        'Authorization': f'Bearer {api_key}',
    }

    data = {
        'model': 'gpt-3.5-turbo',
        'messages': [
            {'role': 'system', 'content': 'You are a helpful assistant.'},
            {'role': 'user', 'content': message_input},
        ],
    }

    # Make a request to the ChatGPT API
    response = requests.post(api_url, json=data, headers=headers)
    response.raise_for_status()
    generated_text = response.json()['choices'][0]['message']['content']

    # Strip the boilerplate prefix ChatGPT sometimes adds
    extracted_info = (
        generated_text.strip()
        .replace('The most compelling headline is ', '')
        .replace('"', '')
    )
    return extracted_info

ChatGPT kept adding extra info to the returned text, typically something like “the most compelling headline is”. I couldn’t get it to stop (it didn’t always do it; it seemed random), so in the end I just look for that prefix in the output and strip it if needed. I then do a second call to OpenAI to generate a synopsis for the most compelling headline.
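The archive step mentioned above (storing selected headlines so they are skipped the next round) can be sketched roughly like this; the file name and JSON layout here are my assumptions, not the exact ones from the app:

```python
import json
import os

ARCHIVE_FILE = "headline_archive.json"  # assumed file name

def load_archive():
    # Return the set of headlines that were already posted
    if not os.path.exists(ARCHIVE_FILE):
        return set()
    with open(ARCHIVE_FILE) as f:
        return set(json.load(f))

def filter_new_headlines(scraped):
    # Drop any headline we have published before
    seen = load_archive()
    return [h for h in scraped if h["headline"] not in seen]

def archive_headline(headline):
    # Record a headline so it is ignored next round
    seen = load_archive()
    seen.add(headline)
    with open(ARCHIVE_FILE, "w") as f:
        json.dump(sorted(seen), f, indent=2)
```

Each run then filters the freshly scraped pairs through the archive before asking ChatGPT to pick a winner.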

A word on proxies

When scraping data from websites – which is pretty much what I’m doing here – it’s best to maintain a level of anonymity to prevent being blocked. Proxies allow the scraper to send requests through different IP addresses, making it more challenging for websites to identify and block the scraping activity.

I ended up choosing ProxyScrape because of the diverse types of proxies, the API access, and what seems like a large set of options for premium proxies and integrations.
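For what it’s worth, routing a requests call through a proxy takes only a scheme-to-proxy mapping; a minimal sketch, where the proxy URL is a placeholder you would replace with one from your provider:

```python
import requests

def proxy_config(proxy_url):
    # requests expects a scheme -> proxy mapping
    return {"http": proxy_url, "https": proxy_url}

def fetch_via_proxy(url, proxy_url):
    # Route the request through the proxy; the timeout keeps a
    # hung proxy from stalling the whole scraping run
    return requests.get(url, proxies=proxy_config(proxy_url), timeout=10)
```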

Step 2: Creating a compelling Image

Let me first say here that ChatGPT is nothing short of amazing. I could never have written the 100+ lines of Python code alone without going mad. And yet, it’s not like the whole app was generated by ChatGPT either. I really see it as a collaboration: I had an idea, fed it to ChatGPT, it spat out code, and I debugged or changed it. Sometimes the code worked instantly; more often I had to iterate. Sometimes ChatGPT surprised me with what it didn’t know or got wrong. For example, it didn’t seem to know how to accurately call its own APIs, which I found odd; I had to turn to Stack Overflow and the like to find the right, recent API calls.

Where I really struggled with its output was the step where we put the generated text properly on an image. We tried several different Python libraries, but it was hard for ChatGPT to produce code flexible enough to render the text at the right size without running over the edges of the image. Eventually we got there, although it’s still not great and needs improvement.
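The core of the fix was wrapping the headline before drawing it. A minimal sketch of the wrapping logic (the width and line limits are assumptions; the actual drawing happens on top of this with an image library):

```python
import textwrap

def wrap_headline(text, width=28, max_lines=4):
    # Break the headline into lines that fit within the image,
    # truncating with an ellipsis if it is still too long
    lines = textwrap.wrap(text, width=width)
    if len(lines) > max_lines:
        lines = lines[:max_lines]
        lines[-1] = lines[-1][: width - 1] + "…"
    return lines
```

Each returned line is then drawn at an increasing vertical offset, so no single line can run past the image border.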

For the image, I use DALL-E, which worked easily enough, although I had to iterate often on the prompt to get a good result. To change it up, I ask it to use the style of an artist who was born on the day the image gets created.

Now I had a decent image, with a catchy headline coming from the news. The final step: getting it on Instagram!

from openai import OpenAI
from image_utils.save_image_as_png import save_image_as_png

def generate_instagram_image(headline):
    # Set up your OpenAI API key
    openai_api_key = "sk-XXX"
    client = OpenAI(api_key=openai_api_key)

    dall_e_prompt = (
        "A futuristic and realistic photo or painting of a person, "
        "inspired by the style of an artist born today. Do not add "
        "letters or text. Make the subject stand in a background that "
        "is typical for the chosen artist."
    )

    # Generate the image with DALL-E; the 'quality' parameter
    # requires the dall-e-3 model
    response = client.images.generate(
        model="dall-e-3",
        prompt=dall_e_prompt,
        size="1024x1024",
        quality="standard",
        n=1,
    )

    # Get the URL of the generated image
    image_url = response.data[0].url

    # Save the image as "instagram_post_image.png"
    save_image_as_png(image_url, "instagram_post_image.png")

Step 3: Getting it all onto Instagram

This was easily the most work, with the most head-scratching, debugging, and reading of API documentation. I’m not sure if it’s on purpose, but Meta makes it quite complex to get anything done programmatically. ChatGPT was no help here either; it kept suggesting API calls to Instagram, and later Meta, that were simply incorrect.

Eventually, by reading other blogs and sample code, I figured out that this is the way to get a post up onto Instagram via Python:

Creating the right Meta accounts

Meta funnels all its APIs through one account: a Meta developer account. Even if you only want to work with Instagram, you cannot go directly to the Instagram APIs.

So, you need the following:

  • Instagram business account, otherwise the APIs won’t work
  • Meta developer account
  • You need a Meta Page, even if you don’t want to do anything with Facebook. The page is the main object you will send the API calls to.

Credentials and tokens

To be able to make API calls to Meta, you first need a token. You start with a short-lived token (short as in about an hour), which you can then exchange for a long-lived token (60 days). It took me quite some digging to figure this out, but eventually I got the calls right.
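The exchange itself boils down to a single Graph API call; a sketch, assuming the standard fb_exchange_token flow with placeholder credentials:

```python
import requests

GRAPH_TOKEN_URL = "https://graph.facebook.com/v18.0/oauth/access_token"

def exchange_params(app_id, app_secret, short_lived_token):
    # Query parameters for the token-exchange call
    return {
        "grant_type": "fb_exchange_token",
        "client_id": app_id,
        "client_secret": app_secret,
        "fb_exchange_token": short_lived_token,
    }

def get_long_lived_token(app_id, app_secret, short_lived_token):
    # Trade the short-lived token (about an hour) for a ~60-day one
    response = requests.get(
        GRAPH_TOKEN_URL,
        params=exchange_params(app_id, app_secret, short_lived_token),
    )
    response.raise_for_status()
    return response.json()["access_token"]
```

The app ID and secret come from the Meta developer account created above.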

Posting to Meta

You cannot send a picture directly to Instagram. Instead, you first upload the image to a hosting service (I use Imgur) to get back a URL. That URL you can then use in the call to the Facebook Graph API that Meta exposes.
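For reference, an anonymous Imgur upload is one POST to their API; a sketch, assuming you have registered an Imgur application and have its Client-ID:

```python
import base64
import requests

IMGUR_ENDPOINT = "https://api.imgur.com/3/image"

def imgur_headers(client_id):
    # Imgur authenticates anonymous uploads with a Client-ID header
    return {"Authorization": f"Client-ID {client_id}"}

def upload_to_imgur(image_path, client_id):
    # Upload the local PNG and return the public URL Imgur hands back
    with open(image_path, "rb") as f:
        payload = {"image": base64.b64encode(f.read())}
    response = requests.post(
        IMGUR_ENDPOINT, headers=imgur_headers(client_id), data=payload
    )
    response.raise_for_status()
    return response.json()["data"]["link"]
```

The returned link is what goes into the image_url parameter of the Graph API call below.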

But, this has to happen in two steps:

  1. You create a container with the image in it.
  2. You post the container

Here is the code to create the container:

import requests

def inst_create_container(account_id, access_token, image_url, caption):
    graph_url = 'https://graph.facebook.com/v18.0/'

    url = graph_url + account_id + '/media'
    params = {
        'access_token': access_token,
        'caption': caption,
        'image_url': image_url,
    }
    response = requests.post(url, params=params)
    return response.json()

This returns a container id, which allows us to make the final call: posting the container to the Graph API. After that, our image appears on Instagram.

Notice the account id in the call; this is where I spent hours figuring out what the right ID would be. It took me a while to find out that it has to be the ID of the Facebook Page you created earlier. That seems odd to me; it feels like it should be the ID of the Instagram account, or even the Facebook account ID, but I am sure Meta has good reasons for this.
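For what it’s worth, the Graph API can also tell you which Instagram business account is linked to a Page, which helps when hunting for the right ID; a hedged sketch:

```python
import requests

GRAPH_VERSION = "v18.0"

def graph_object_url(object_id):
    # Base URL for any Graph API object
    return f"https://graph.facebook.com/{GRAPH_VERSION}/{object_id}"

def get_linked_instagram_account(page_id, access_token):
    # Ask the Graph API which Instagram business account is
    # connected to the given Facebook Page
    params = {
        "fields": "instagram_business_account",
        "access_token": access_token,
    }
    response = requests.get(graph_object_url(page_id), params=params)
    response.raise_for_status()
    return response.json()["instagram_business_account"]["id"]
```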

Here is the code to post the container to the graph:

import requests

def publish_container(creation_id, instagram_account_id, access_token):
    graph_url = 'https://graph.facebook.com/v18.0/'
    url = graph_url + instagram_account_id + '/media_publish/'
    params = {
        'access_token': access_token,
        'creation_id': creation_id,
    }
    response = requests.post(url, params=params)
    return response.json()

Final notes and next steps

Well, it would have been far less work to simply create a manual post every day than to code this beauty for a month. But I learned a ton:

  • ChatGPT is a true game changer; it turns me from a wannabe coder into someone who can code anything.
  • ChatGPT is not a one-stop shop that just generates whatever code you need; it is more of a companion. Like having a friend who is a really good programmer and always willing to help out.
  • DALL-E is amazing, but creating the right prompts takes time. Needs more work.
  • Meta’s Graph API seems incredibly powerful for marketing purposes. Doing this gave me a view into how businesses can track the behavior of Meta users across WhatsApp, Facebook and Instagram.

So what’s next? Scrape and Bake is running, very much as a first version. I will fine-tune the logic that pulls news, make the posts catchier, and try to generate Reels instead of just a picture. My goal, in the end, is to get Instagram viewers with an interest in lead gen to click through to the main 101 Data website. Let’s see where we get!