Scheduled Automation - Part Two: Advanced Scheduling
Like many automation software solutions, TexAu comes with a suite of scheduling tools:
- Regular Scheduling
- Advanced Scheduling
Advanced Scheduling is the second model, and it's the one we will cover here.
You can schedule complex workflows in advance to automate your business processes without any manual intervention. For example, this setting is pretty helpful for collecting social media posts, group members, lists of activities (profile or company posts, articles), or job ads (recruiters/human resources).
💡 Please note that the advanced scheduling options are most suited for extracting post content, search filters, or group members' posts at scale. For social media drip campaigns or LinkedIn outreach purposes, using the default scheduler (see Regular Scheduling - Part I) is enough in most cases.
Using the Advanced Scheduler, you can schedule an automation with more granular details.
For this, check the box "Show Advanced Options" to expand this menu:
Here you can set up a single automation or workflow to start and end within a specific date range:
... but also set it to start and end at a specific time of the day:
... or set it to run on a specific day of the week:
... or on specific months of the year:
Or it can be a combination of all the above:
When you are all set, launch the automation. It will run on the defined schedule in your current timezone.
You can set up your timezone from your account preferences here:
This scheduling model is especially useful when you have to scrape data daily and schedule it in advance at regular intervals, for example:
- a list of Google Maps search queries
- LinkedIn & Sales Navigator search filters
- a list of Facebook Groups URLs
- a list of Facebook post URLs
- a list of new group members’ profile URLs
- a list of Instagram hashtag post URLs
- a list of profile post URLs
- and many more...
Let's suppose you want to scrape the last 20 posts from 40 Facebook groups you follow and do this every day.
But obviously, you don't want to scrape all that data at once; you want to spread the processing out over the day. Late-night schedules work, of course, but it's better to mimic human browsing during the day.
Here's the list of 40 Facebook groups from which you want to extract the last 20 posts daily.
Select the automation called "Extract Posts From A Facebook Group":
In the automation settings, click the "Google Sheet URL" icon:
Add your cookies and sheet URL, then map the two columns:
- Column A: Facebook Group URLs to process, 40 groups = 40 Group URLs.
- Column B: the number of posts to scrape, the last 20 posts for each group.
Then at the bottom, check the box "Advanced Scheduling".
This will expand the following scheduling options:
What is all this? Let's take the values below as an example and explain each option, line by line.
You want to scrape the last 20 posts from 40 Facebook groups every day.
To scrape all these posts, you could simply run the automation without scheduling it.
But you know that social platforms can easily detect automated activity that is repetitive and runs at regular intervals.
So you decide to cover your tracks by adding more randomization, with delays between each scraping operation.
A little elementary math:
20 posts × 40 groups = 800 new posts to scrape every day.
You also know that social platforms check whether you operate outside working hours. Automated activity on your account at night could trigger a warning, so it's better to automate during office hours.
So you will schedule your automation on working hours only.
For this, you will schedule your automation to run for 7 hours, from 9 am to 4 pm.
You will then calculate the delays you can add between scraping operations to process these 800 posts in 7 hours. This gives you the minimum completion time of your automation. Of course, the actual duration depends on multiple factors (random delay pauses, daily limits, rate limits), so treat it as an average: you start from the time window you want the automation to run in and fit the scheduled tasks as well as possible within that frame.
There are multiple possibilities here: you could either scrape everything at once (not advised!) or split the scraping across the day in numerous steps and add random delay pauses to make the process look more like human activity.
The interval could be every hour or every 20 minutes, whatever you want.
How do you do that?
Suppose we want to process these 800 posts in 7 hours, and trigger an execution every 30 minutes.
- 800 / 7 ≈ 114 posts per hour ≈ 57 posts every 30 minutes ≈ 20 posts from 3 Facebook groups
- So let's start by adding 3 Facebook Group URLs every 30 minutes, and randomize it between 3 and 4 rows per run.
- That means every 30 minutes, 3 or 4 Facebook Group URLs will be processed, scraping between 3×20 = 60 and 4×20 = 80 posts per run.
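To double-check the batching arithmetic above, here is a minimal Python sketch. The figures are this example's assumptions (20 posts, 40 groups, a 7-hour window, 30-minute runs), not TexAu parameters:

```python
import math

posts_per_group = 20   # scrape the last 20 posts per group
groups = 40            # 40 Facebook Group URLs in the sheet
hours = 7              # working window, 9 am to 4 pm
interval_minutes = 30  # trigger an execution every 30 minutes

total_posts = posts_per_group * groups                       # 800 posts per day
posts_per_hour = total_posts / hours                         # ~114 posts per hour
posts_per_run = posts_per_hour * interval_minutes / 60       # ~57 posts every 30 minutes
groups_per_run = math.ceil(posts_per_run / posts_per_group)  # 3 group URLs per run

print(total_posts, round(posts_per_hour), round(posts_per_run), groups_per_run)
# prints: 800 114 57 3
```

Plugging in your own post counts, group counts, and time window gives you the batch size for your schedule.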
Let's add even more randomization on top of that by inserting a 12-to-27-second pause between each batch of 3-4 Facebook Group URLs processed:
Now let's also add a random time interval between each group URL processed.
- scrape 20 posts from group #1 URL
- wait 320 seconds...
- scrape 20 posts from group #2 URL
- wait 520 seconds...
- scrape 20 posts from group #3 URL
- wait 300 seconds...
- scrape 20 posts from group #4 URL
- wait 280 seconds...
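Conceptually, the randomized schedule above behaves like the loop below. This is just an illustrative Python sketch: the URLs, the scrape_group placeholder, and the 280-520 second range are assumptions for this example, and TexAu performs the actual scraping and waiting for you.

```python
import random
import time

# Hypothetical group URLs; in TexAu these come from Column A of the sheet.
group_urls = [f"https://www.facebook.com/groups/demo-{i}" for i in range(1, 5)]

def scrape_group(url, max_posts=20):
    # Placeholder for the real scraping step, which TexAu handles.
    return f"scraped last {max_posts} posts from {url}"

def run_batch(urls, min_wait=280, max_wait=520, sleep=time.sleep):
    # Process each group URL, pausing a random 280-520 seconds in between,
    # just like the wait steps listed above.
    results = []
    for i, url in enumerate(urls):
        results.append(scrape_group(url))
        if i < len(urls) - 1:  # no pause needed after the last group
            sleep(random.randint(min_wait, max_wait))
    return results

# Use a no-op sleep so this demo finishes instantly.
results = run_batch(group_urls, sleep=lambda seconds: None)
```

Because each pause is drawn at random, no two runs produce the same timing pattern, which is exactly the point.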
Please note that with advanced scheduling, the "Number of rows to process every day" settings should be left EMPTY, since they belong to regular scheduling. Otherwise, they will take priority over advanced scheduling. So it's regular scheduling or advanced scheduling, not both. Yes, we know it's confusing, and we plan to revamp it later.
There are multiple ways to achieve the same in TexAu.
In the first case, we sent a small sample of rows to the automation with shorter delays. In the second case, we added more rows and compensated with longer pauses between each row.
Don't overthink it. Just sprinkle some delays here and there. It will make it harder for social networks to identify a pattern, and the more personal and unique the pattern is, the better.
There is no secret magic sauce or rule to follow here. Just experiment and use what suits your use case the best.
Remember that no matter what you set above, the default limits for each social network in your TexAu account settings still apply. So if you ask to "process 50 rows per day" but the automation limit is 20, the input will be capped at 20. For example, the "Send a LinkedIn message" automation will still process a maximum of 50 profiles per day, its default limit.
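In other words, the effective row count is simply the smaller of what you request and the account-level daily limit. A hypothetical illustration (effective_rows is not a TexAu function, just a sketch of the behavior):

```python
def effective_rows(requested_rows, daily_limit):
    # The account-level daily limit always wins over the schedule settings.
    return min(requested_rows, daily_limit)

print(effective_rows(50, 20))  # → 20: asking for 50 rows with a limit of 20 processes only 20
```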
Here, it's the same thing as with Google Sheets:
Self-explanatory: tick this checkbox if your CSV file or Google Sheet has a header row. TexAu will then start processing the data at row 2, just below the header.
This setting is familiar to Phantombuster users but is often confusing for most people.
Many thought it was a feature to remove duplicate results from the output of the sheet, but it's not.
This setup prevents duplicate input URLs from being processed twice in an automation.
More specifically, it serves the purpose of processing unique profile URLs in your campaigns.
Ex: avoid sending a message or a connection request twice to the same profile when reprocessing a Google Sheet or search-filter input on a regular schedule.
This setting is helpful when you want to process data on the sheet starting at a particular row index rather than from the beginning.
Ex: starting at row 150 and below.
Advanced scheduling in workflows follows the same process described above for automations, but you access it differently.
Here the setting is accessible inside the workflow builder from the upper left menu (Schedule clock icon):
When using a Google Sheet or CSV file as input, you don't launch automations and workflows from the "Play" button (that's for direct input).
Instead, click the Submit or Execute button at the bottom of the Sheets or CSV settings: