6 Easy Ways To Find Emails With TexAu
Recently, a few members asked me how to find emails with TexAu.
So guess what? I decided to do another tutorial...
In TexAu, you can find emails in multiple ways:
- Scraping Google SERPs (similar to Scrapebox)
- Scraping personal emails from Linkedin profiles
- Finding the email pattern of a pro email and validating it using Linkedin data (First Name, Last Name, Company Domain)
- Scraping Facebook company pages
- Scraping websites
- Soon: scraping Instagram bios and profiles, plus Youtube email scraping from video descriptions.
Today, I will showcase six easy ways and their use cases, pros, and cons.
Ninjas never sleep, do you? Are you ready for this one? LET'S GO!
How do you find emails these days? A myriad of tools provide such services, but how do they work?
We can separate them into four categories:
- scraping from public sources and search results
- scraping from social media networks: Facebook company pages, Instagram Bios
- scraping on website domains
- algorithmic: determine all the possible patterns and test which ones validate
Luckily for you, TexAu can do all the above. So let's see how we can use it for our business.
Let's begin with this automation available in TexAu: "Find Emails Using Domain".
This one works like Hunter.io. Like Scrapebox, it works by scraping Google for all the emails found for a target company domain.
Hunter.io domain search
In addition, it provides extra data, like:
- Job Role
- Linkedin Profile
- Twitter Profile
- Email validation
This automation only costs credits when it actually returns emails. You heard it: empty attempts are free. Crazy, huh? TexAu will only debit four email credits if it finds emails; empty-result attempts won't count toward your email credits. Amazing.
To use it, it's super easy:
Input the company website domain (or the email domain if it differs; a few big companies use a different domain just for internal email).
Optional: fill in a Department (IT, Finance, etc... verticals) AND/OR a Seniority Level (Manager, Executive, etc...)
To test it and see how many emails it returns, let's fill in the domain only. First, let's try accenture.com, the famous consulting company:
You can also do this search starting from a CSV or Google Sheet column containing a list of domains. Be mindful of your credit count too.
TexAu found 100 validated emails in less than 3 seconds, with their social media profiles (Linkedin & Twitter). It will also find phone numbers if they are present in public sources. Fast.
This method is the easiest way to find emails in TexAu.
Pros:
- Super fast
- Allows volume, no limit in theory (except your email credits: four credits per successful domain search)
- Good accuracy and validation
- Suited for broad targeting and general marketing email needs (be wary of what GDPR/CCPA allows depending on your country)
Cons:
- Not the most targeted way to reach people (despite the department and seniority level filters)
While it's hard to get personal emails from Facebook profiles due to preference settings and privacy concerns, it's super easy to find business emails on Facebook.
This method is particularly suited for local businesses and small shops like restaurants.
First, let's do a search on Facebook and look for, let's say, "Plumbing":
Go to "Pages", then in the location tab, search for a city, let's say "New York":
Then, in the category filter, choose "Local Business or Place":
You'll see a list, mainly composed of professional plumbers or related services:
Let's click on one of the business pages listed here:
These are the details we will scrape, including the emails if present.
Now in TexAu, go to the automation store tab, click Facebook and search for the "Extract Facebook Search" module:
Copy the search URL from Facebook:
Paste it in the "SEARCHURL" field below, and don't forget to add your cookies with the TexAu Chrome extension:
Once done, click the blue button below, done:
Processing took 31 seconds for 300 pages. Not bad.
TexAu will scrape the first 300 businesses in the listing. Now click "Run Spice Using Result":
We will use these results to "feed" another automation to get what we want: emails and phones.
Now, click the Facebook icon and select "Extract Company Facebook About New" automation in the dropdown:
Chaining modules together like this is an alternative way to build quick workflows. It works like the classic Phantombuster style workflow chaining phantoms. But we recommend instead using TexAu's workflow builder, which is more visual and powerful to build your sequences. Last, the automation scripts above are the same ones you see on the automation store homepage.
You've completed the first step; click "Next":
Again, add your cookies with the Chrome extension, then in the "PAGE URL" field, select the "link" variable in the dropdown.
This variable is the company page URL of each business in Facebook search results:
Then click the button "EXTRACT ABOUT PAGE":
Now go to the results:
The automation starts and will take some time to scrape the 300 businesses' data, within a limit of 15 pages PER HOUR for safety reasons:
Here are the results with emails and phones. Cool, isn't it?
Pros: Great for local lead generation and businesses that use Facebook, like restaurants and services in general.
Cons: Slow at the moment due to the 15-pages-per-hour limit, but the team is working on scraping the public pages to accelerate the process (note you won't get emails when logged off Facebook; you can crawl the site instead for those).
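To see what that safety limit means in practice, here is a minimal throttling sketch (not TexAu's actual code): at 15 pages per hour, the tool has to wait 3600 / 15 = 240 seconds between page loads, so 300 pages take about 20 hours.

```python
import time

# Illustrative throttle matching the 15-pages-per-hour safety limit.
PAGES_PER_HOUR = 15
DELAY = 3600 / PAGES_PER_HOUR  # 240 seconds between page loads

def crawl(pages, fetch, sleep=time.sleep):
    """Fetch each page, pausing DELAY seconds after each one."""
    results = []
    for page in pages:
        results.append(fetch(page))
        sleep(DELAY)  # keep the account under the safety limit
    return results

# Demo with a fake fetch function (no real network calls here)
delays = []
print(crawl(["page1", "page2"], fetch=str.upper, sleep=delays.append))
```

The slow pace is the price of keeping your account safe; scheduling the crawl overnight is the usual workaround.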
Another source of emails is Linkedin. Here we will scrape the personal emails from it.
First, a few considerations in mind:
Linkedin rolled out an update last year that turned users' registration email OFF by default. Linkedin users have to enable it in their account settings to show it, and many never did.
What does that mean? Your chances of scraping those emails are lower than before.
To do this, let's take the example of our last tutorial and start from a Linkedin event page to scrape the personal emails of all attendees.
We will use the "Find Email Using Linkedin Profile" automation at the end of our flow:
And we will build our workflow like this:
- STEP1: Extract Attendees from an Event (1000 max)
- STEP2: Scrape their profile page to get the most details from it
- STEP3: Scrape their personal emails
STEP1: Select the automation called: "Extract Attendees from a Linkedin Event":
Add your cookie and the Linkedin event URL. For testing purposes, you can set a smaller number like 10 or 20 maximum attendees; otherwise, leave that last field empty.
STEP2: Fire up a new automation module and chain it to the first one by clicking the blue (+) button below the first module:
Add the module called "Scrape a Linkedin Profile":
Same as before, add your cookie, and this time, click the small (+) blue circle next to the "Profile URL" field and add the "url" variable from the previous automation.
We will pass the Linkedin profile URL to that automation. Additionally, you can check the boxes if you have a premium account. You can also check the box to find the pro email too. But leave that for later. I'll show you another way to do so.
STEP3: Fire up a new module to chain to the first one by clicking the blue (+) button below the second module.
Add the automation called "Find Email using Linkedin profile":
Same as before again, add your cookie, but this time we will add the "profileUrl" (Url of the Linkedin profile) and "JobCompanyUrl1" (current company website domain) as variables.
Then launch the automation and go to the results:
It took 37 min (not TexAu CPU time) to process the first 20 profiles within reasonable limits:
So here come your Linkedin personal emails, excellent!
Pros: It works (providing prospect enabled it).
Cons: You are using personal emails, which can cause legal issues if you contact the prospect directly (GDPR in the EU, CCPA in the US). Use with caution.
Probably my favorite method of all in B2B, on par with tools like Dropcontact and Hunter:
finding the professional email pattern from Linkedin Sales Navigator data.
The huge advantage of this method is the level of targeting and accuracy.
Using this will cost you one email credit per email found, but it is the most accurate way to find and validate a pro email.
How does it work?
- It scrapes all public emails found on the internet for a company domain.
- It identifies all the patterns and tests all the possible combinations.
- It validates them.
For this reason, this way of finding emails is also pretty resource-intensive and costly.
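To make the pattern step concrete, here is a minimal sketch of what "enumerating the combinations" looks like. The function name and the pattern list are my own illustration, not TexAu's internals; in the real flow, each candidate would then be checked against the company's mail server.

```python
# Hypothetical sketch of pattern enumeration from Linkedin data
# (First Name, Last Name, Company Domain). Names are assumptions.

def candidate_emails(first, last, domain):
    """Return common corporate email patterns for one person."""
    f, l = first.lower(), last.lower()
    patterns = [
        f"{f}.{l}",    # john.doe
        f"{f}{l}",     # johndoe
        f"{f[0]}{l}",  # jdoe
        f"{f}{l[0]}",  # johnd
        f"{f[0]}.{l}", # j.doe
        f"{f}",        # john
        f"{l}.{f}",    # doe.john
    ]
    return [f"{p}@{domain}" for p in patterns]

print(candidate_emails("John", "Doe", "acme.com"))
```

Validating each candidate requires a round trip to the mail server, which is why this method costs credits and resources.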
In this example, I will use a Sales Navigator account to make a super-duper hyper-targeted search filter.
I encourage most of you to buy Sales Navigator because it's the plan with the highest limits for automation, and it also returns the most data from its advanced filters.
Compare this to a restaurant that gives you a free meal (regular Linkedin): They take your email, phone number and track you in exchange for a free meal. You eat every day there for free, and you never pay. Then one day, you automate and eat data like crazy and still never pay. But when you pay, the owner becomes suddenly more tolerant of your data appetite. Well, that's Linkedin Sales Navigator in a nutshell. Hence the higher limits, but not unlimited. Nothing is free in this world, sadly.
Let's go back to our Sales Navigator search. I decided to search for IT operations profiles in big companies. Here I want to narrow it down to all executives and the staff that reports to them, e.g.:
- IT Analysts
When doing so, the key is to enumerate ALL the variations of the same job name, because the same jobs can have different names or spellings, like:
- CIO= Chief Information Officer
- CTO= Chief Technology Officer
- IT Analyst= Production Analyst, Infrastructure Analyst, IT Architect
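One way to handle these variations systematically is to keep them in a small lookup table and build the OR query for the title filter from it. This is just an illustrative helper, not a TexAu feature; the variant table extends the examples above.

```python
# Illustrative: expand roles into all their spellings and build a
# Sales-Navigator-style OR query for the title filter.
TITLE_VARIANTS = {
    "CIO": ["CIO", "Chief Information Officer"],
    "CTO": ["CTO", "Chief Technology Officer"],
    "IT Analyst": ["IT Analyst", "Production Analyst",
                   "Infrastructure Analyst", "IT Architect"],
}

def build_title_query(roles):
    terms = []
    for role in roles:
        terms.extend(TITLE_VARIANTS.get(role, [role]))
    # quote multi-word titles so the search treats them as phrases
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return " OR ".join(quoted)

print(build_title_query(["CIO", "IT Analyst"]))
```

Keeping the variants in one place makes it easy to reuse the same query across searches and to add solution terms like "SAP" later.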
I will also:
- Remove all our current customers from the list (you don't want to reach out to them, right?)
- Add any solution or technology terms included in some job titles, ex: "SAP" (SAP analyst, SAP database analyst, SAP manager, and so on).
- Filter out anything below a certain employee headcount to keep only the big companies
- Add the function/vertical: "Information Technologies".
- Wipe out everything like students, trainees, apprentices, trainers, coaches, consultants, freelancers, etc...
- Add location
- Spoken language
Now, let me show you what I mean by hyper-targeted search:
Sales Navigator allows the use of operators in the search bar like Linkedin does, but better: it also enables negative filters.
Those are the terms you see highlighted in red on the left above.
To use negative filters, hover over one of the blue filter bubbles; a little stop sign appears in its right corner.
Click on it, and it turns into a negative filter, shown in red. Negative filters are easy to set up and let you build super surgical search filters.
I am going to set this straight: Linkedin and Sales Navigator search filters are inaccurate!
Linkedin's UX focuses on making users dwell on the platform while showing them the least possible amount of data, to prevent scraping and increase dwell time. So you have to refine these search filters manually, and it takes time.
Worse, search results often don't show a profile's current job role. This happens because most people forget to set an end date on their previous job.
That's why the most accurate way to know the current job role is to scrape their profile page itself, not trusting the one showing from the search results alone.
Last tip: to improve your chances of reaching people, apply these additional filters:
- "Active in the last 30 days": target the profiles that are presently active
- "remove viewed leads from search": so each time TexAu is processing the search filters, you will only see new profiles. Super handy to go above the 2500 profile display limit per search (1000 profiles on LinkedIn, 2500 on Sales Navigator). Very useful when scheduling.
- "2nd-degree connection": start with those to benefit from higher limits than 3rd-degree connections.
Once you are satisfied with your filter, save it:
Then go to "Saved Searches", click on the one you want to use:
Once you open the search page, copy its URL. We will reuse that for our workflow:
Now, let's build our workflow.
STEP 1: Fire up a new module, this time selecting "Scrape Linkedin Sales Navigator Leads Search":
As usual, add your cookie and the search URL of your killer Salesnavigator search:
For this tutorial, and because of the processing time to get the results, I will only limit it to 20 profiles.
STEP 2: Fire up a new module, "Convert a Sales Navigator URL", to retrieve the original Linkedin regular URL that will give us access to the company domain name.
Same as before, add your cookie and the "profileUrl" variable taken from the previous automation:
STEP 3: Add a new automation module again, "Scrape a Linkedin Profile", which will give us the company domain variable:
Add the "defaultProfileUrl" variable, which corresponds to the original profile URL (not the Sales Navigator one, which is different for the same profile):
STEP 4: add the "Find an Email Address" module:
Now we will add the "firstName", "lastName" and "companyDomain" variables from the previous module to find the pro email:
Once done, double-check, triple-check, then run the workflow:
Go to results:
The workflow is launching and will take a while to complete. Remember, TexAu will add random delays between page loading to mimic human interaction. These longer delays are necessary to ensure your account safety.
If you want to accelerate this process, run the whole filter-search profile scraping first; once you have hundreds of profiles, send them to that automation.
Out of 20 profiles scraped, we got eight valid emails, one catch-all (potentially valid), and one unknown:
Email verification can be of 4 types:
- Valid = 100% verified; the target email server has sent a confirmation.
- Catch-All or Accept-All = potentially valid. The company email server sends a confirmation no matter what email pattern we send, hence the name "catch-all". Often used by sysadmins of big companies to stop people like us from finding the correct email. It may or may not work.
- Invalid = self-explanatory.
- Unknown = similar to a catch-all, but the server doesn't send a confirmation. The guess is based on public data available on Google.
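As a rough sketch of how a verifier arrives at these four statuses (the real logic is more involved): it typically probes the mail server with the candidate address plus a random "gibberish" address on the same domain, and compares the answers. The function below is my own simplification, not TexAu's implementation.

```python
# Simplified classifier: map two RCPT-TO probe results to a status.
# candidate_accepted / gibberish_accepted are True, False, or None
# (None = the server never answered).

def classify(candidate_accepted, gibberish_accepted):
    if candidate_accepted is None:
        return "unknown"    # no confirmation from the server
    if not candidate_accepted:
        return "invalid"    # server rejected the address
    if gibberish_accepted:
        return "catch-all"  # server accepts anything on the domain
    return "valid"          # accepted the real one, rejected junk

print(classify(True, False))   # valid
print(classify(True, True))    # catch-all
```

This is why a "valid" result is so much stronger than a "catch-all": the server demonstrably rejects wrong addresses.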
Pros: Super targeted with a good return on validated emails. The best method for B2B.
Cons: Slower, due to Linkedin safety limits. However, if you have a sales team in the same office, you could accelerate this process by splitting profile lists/accounts.
Here we could have gone faster by just scraping the search results to get hundreds of emails, without as much wait time between each profile scrape. But as mentioned above, the accuracy would suffer, with fewer valid emails as a result (the domain often corresponds to a past job while being flagged "catch-all", for instance). Not very useful. Moreover, you can schedule a long crawl at night and slowly scrape all these profiles in advance, then batch-find the emails weeks later.
Another way of finding emails is to scrape from websites by crawling their directories and pages directly.
Most of the time, you will find them on pages like "/contact" (generic contact emails) or "/about-us" (with VPs', execs', and founders' emails, for instance). So overall, it's hit or miss depending on what's present on the site, but still an excellent way to find emails.
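The extraction step itself is straightforward: once a page is fetched, anything that looks like an email address is pulled out with a pattern match. A minimal sketch (the regex is a common simplified form, not TexAu's exact one):

```python
import re

# Pull anything email-shaped out of a page's HTML, deduplicated
# while keeping first-seen order.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(html):
    return list(dict.fromkeys(EMAIL_RE.findall(html)))

page = "<p>Reach us at contact@example.com or press@example.com</p>"
print(extract_emails(page))  # ['contact@example.com', 'press@example.com']
```

The hard part is not the regex but deciding which pages to crawl, which is where the "Max Depth" setting discussed later comes in.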
Scraping emails from websites is a widespread use case for SEO purposes, like reaching out to bloggers or other startups in a similar niche (not competitors) for guest posting, backlink exchanges, podcast bookings, and collabs.
You can see this excellent post by Ahrefs about the process:
So for this exercise, I will also use Ahrefs to find potential blogs/startups I can outreach to promote TexAu or my fantastic automation consultancy services to them 😛 and maybe get a backlink in return too:
Let's open up Ahrefs (you can try it for $7 for a week) and search for "Linkedin automation" OR "Linkedin Growth Hacks":
Let's search for all the articles that rank for "Linkedin automation" OR "Linkedin Growth Hacks".
As in Ahrefs guide, let's select the content that:
- Only contains our keywords in the page title
- Has a Domain Rating (DR, quite similar to Moz.com's Domain Authority but weighing backlink strength more heavily) between 20 and 70
- We filter out sites with a small DR, and those above 70, to build our backlinks gradually; sites with the highest ratings (DR above 90) are primarily big sites that would make our efforts harder
- Are written in English like our audience
- Filter one page per domain and exclude homepages (we want blogs)
Once you are satisfied with your filtering, go to the "Websites" tab and export the results to get the domains list in Excel/CSV:
Now upload the file to a Google Sheet and clean out all your known competitors (except strategic partners you might integrate with), PR newswires, or any irrelevant domains you find. We want lead generation or SEO agencies, bloggers, and influencers:
Import the Ahrefs extract into a new sheet:
A handy thing about having the extract in a Google Sheet is that you can easily preview the page title and meta description to get an idea of whether it's relevant for your outreach campaign:
Now, let's do some manual cleaning following the above criteria and keeping what we need.
Let's highlight in red the results we want to wipe out:
Now select it for deletion:
Once you are happy with the data cleaning, let's import it into TexAu to find the emails.
Share your Google Sheet, setting its permission to "viewer", then copy its link:
Open TexAu and launch the automation called "Extract Emails and Phones from Websites":
Enter your Gsheet URL and map the first column (A) containing the domain names:
A level of "3" for "Max Depth" should suffice in most cases:
Just an explanation about the "Max Depth" level:
Usually, contact emails are found on the contact page located in the "/contact" directory of a website. This is level 2.
The homepage is level 1.
If you want to dig deeper to find emails on the website in /resources/about-us, this will be level 3, and so on.
The deeper you crawl into the website directories, the longer it will take to process.
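The level of a page, as described above, is just the number of path segments plus one. A tiny helper makes the rule explicit (my own illustration of the convention, not TexAu code):

```python
from urllib.parse import urlparse

# Matches the levels described above: homepage = 1,
# "/contact" = 2, "/resources/about-us" = 3, and so on.
def page_level(url):
    path = urlparse(url).path.strip("/")
    return 1 + (len(path.split("/")) if path else 0)

print(page_level("https://example.com/"))                    # 1
print(page_level("https://example.com/contact"))             # 2
print(page_level("https://example.com/resources/about-us"))  # 3
```

Since each extra level multiplies the number of pages to fetch, keeping Max Depth at 3 is usually the right trade-off between coverage and speed.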
Now, run the automation:
After a while (1-2 hours), it will return the emails:
Hmmm, some familiar names there...
You can directly export the file to your favorite email outreach tool.
Pros: Ideal for finding the emails of the website owners, small blogs
Cons: Not suitable for B2B/corporate email outreach
The last method we will see here is extracting emails from Google Maps.
Let's search for plumbers in Brooklyn, New York:
While we are at it, let's add Manhattan to the list:
Next, let's add Queens:
One more for the road with the Bronx:
Put the above on a Google Sheet and rename the sheet "Gmaps" or anything memorable for you:
Tired of entering all those searches manually? Alright, check out this nice little hack below.
Look at how the URL is composed: the query terms are simply joined with "+" signs (add the country when searching outside the US).
We could add this to a Google Sheet (remember, we are supposed to be lazy smart) and concatenate the above to rebuild those URLs for every city or job, then use TexAu with it.
In the New York example, we could go and find all the state's cities and paste them into a column:
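The concatenation trick can be sketched in a few lines. The exact URL shape is my assumption based on what Maps shows in the address bar; the helper names are hypothetical:

```python
from urllib.parse import quote_plus

# Rebuild a Google Maps search URL for every job/city pair instead
# of copy-pasting each search by hand. URL shape is an assumption.
BASE = "https://www.google.com/maps/search/"

def gmaps_url(job, city, country=None):
    query = f"{job} in {city}" + (f" {country}" if country else "")
    return BASE + quote_plus(query)

jobs = ["plumbers"]
cities = ["Brooklyn New York", "Manhattan New York"]
urls = [gmaps_url(j, c) for j in jobs for c in cities]
print(urls[0])  # https://www.google.com/maps/search/plumbers+in+Brooklyn+New+York
```

In a Google Sheet, the same thing is a single CONCATENATE formula dragged down the column.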
You can take this template to try:
Now you can directly input the same text query you could enter in maps search, no need to go to maps anymore and copy-paste your maps search URLs. Simply enter a list of text queries in a Google Sheet (ex: restaurants in Paris), and TexAu will find the search URL for you!
For this, let's make a custom workflow:
What this workflow will do:
- Take all the location listings above, put each URL in a Google Sheet as the data source
- Extract all the location listings up to 6 pages per location
- Get business names and phone from the listings
- Extract emails and phone from each website
- Put all this in a Google Sheet
First, let's create a "local variable" representing the column containing all the Gmaps URLs.
You can call it whatever you want; here I chose "gmapsList". Leave its value empty, then click "Add".
STEP 1: Fire up the first module in the chain, "Google Maps Extractor":
For the "Maps Search Link" field, we will add our local variable called "gmapsList". Next, we will link this variable to our data source on Gsheet containing all the Maps URLs we want to process.
For the "Number of Pages" field, add "6" to get the maximum data (6 pages maximum per location).
Now let's use a filter to keep only the businesses from the listings that have a website (not all do, yes):
STEP 2: Fire another module, "Extract Emails and Phones from Websites":
Add the variable "websiteUrl" from the last automation:
For the "Max Depth" field, enter "3", which should be enough, as explained before.
Also, check the "RETURN SINGLE ROW" box. Otherwise, all the emails will be spread across multiple rows, resulting in an ugly file. Instead, let's put them all in one cell.
This setting will allow us to grab some more data in addition to the Maps listing:
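Roughly, "RETURN SINGLE ROW" collapses the one-row-per-email output into one comma-separated cell per business. A small sketch of that transformation (my own illustration, with made-up domains):

```python
# Collapse (domain, email) rows into one row per domain, with all
# emails joined in a single cell. Sample data is hypothetical.
rows = [
    ("plumberco.com", "info@plumberco.com"),
    ("plumberco.com", "jobs@plumberco.com"),
    ("drainfix.com", "hello@drainfix.com"),
]

def single_row(rows):
    merged = {}
    for domain, email in rows:
        merged.setdefault(domain, []).append(email)
    return [(d, ", ".join(emails)) for d, emails in merged.items()]

print(single_row(rows))
```

One row per business keeps the final sheet aligned with the Maps listing, which matters for the Google Sheet export in the next step.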
STEP 3: Finally, let's add our final module "Send data to Google Sheet":
For the input, add:
- The sheet URL (set permission to editor, since TexAu writes to it)
- Connect your Google account
- Start at row 2 if you have a header on the file at row #1. Then, TexAu will add the data below it.
- Indicate the name of the second sheet that will receive the scraped data
Let's map the sheet now:
We will use the template below that we labeled after the above-selected variables:
Finally, let's add our input Google sheet source ("Gmaps" sheet above). The 2nd sheet will receive the results.
Copy the exact Google Sheet link as above, then add the first sheet and first column with the Gmaps URLs:
To add our input data source from the Google Sheet, click the CSV/Sheet button in the menu:
Add the first "A" column:
Please remember that the Google Sheet connection will only work IF the first sheet of your Google Sheet workbook IS NOT EMPTY! So fill in your Maps URLs in the first column of the first sheet to allow the connection; otherwise, you'll get a red error message.
Click the "Submit Google Sheet" button and launch the workflow:
Scraping all these results might take a few hours, so for this test, I ended up scraping just one location from the listings to speed things up a little:
And voilà, here are your emails:
Pros:
- Very targeted and ideal for local lead generation
Cons:
- Resource intensive and slow, due to the automation crawling websites one by one
- Most local businesses don't check their emails as much as in B2B (they often use a personal email, prefer the phone, etc...)
Another excellent addition to TexAu that has multiple use cases is the Email Verifier module.
So to showcase it, I just grabbed an old CRM file containing a lot of outdated data:
I uploaded the first 100 rows to a Google Sheet:
Then, I launched the workflow builder and created a local variable.
Same as before, we will use a local variable to map the column containing the data we want to process in our automation.
Let's call that variable "emailList". Leave the variable value empty.
Now, let's fire up the first automation, the Email Verifier:
In the input section, let's add our emailList local variable:
Copy our sheet URL and set the access as "viewer" (reading mode only):
Create another sheet in the workbook called "Verify". This sheet will be the output of our test, showing which emails are valid:
Fire up the last automation module, "Send Data to Google Sheet":
Add your Sheet URL, connect your account, start at row two because of the header, and indicate the second sheet, "Verify", as the output:
Map the variables to the sheet:
- local variable: outputs the emails we test in the first column
- verified: outputs TRUE if valid, FALSE if not, in the second column
Now go to CSV/Sheet:
Map our first sheet (Sheet1), and add the first column containing the emails.
Then launch the workflow by clicking the green button "Submit Google Sheet":
Next, go to the results section to see the logs:
You will see the email data output arriving on the sheet in real time:
Last, I wanted to quickly compare TexAu results with another reputable email verification tool: Emailable (formerly theChecker):
So far, TexAu got 17 valid emails out of 66.
With Emailable, we ended up with 19 valid emails out of 66. So TexAu's email verifier looks pretty good in comparison, with only a 3% difference between the two.
Cons: Slower than a dedicated email validation tool at the moment
This automation is pretty resource-intensive and will cost you one email credit per email found. It certainly won't replace a dedicated mass email verification tool, but it will find its use for your daily email outreach needs via the many email integrations TexAu offers. Later on, it will be possible, as with Hunter and Dropcontact, to buy separate "one-time" credits for this feature at competitive pricing.
Today we saw six ways to find emails in TexAu. More than a social automation tool, it's a badass lead generation sales machine that can apply to a wide variety of applications ranging from Sales to Marketing and SEO.
The main challenge is to be creative, because there are so many possibilities that it can be disconcerting at first without proper guidance. It will take some time, but it's worth it.
TexAu, with its modular approach, allows a level of freedom that no other "point and click" tool can offer.
How many lead generation and email finder tools have you bought, always having to switch from one to another to get WHAT YOU WANT? With TexAu, you can have it all inside the box.
Find a use case, break it down, and play with TexAu. There will always be more than one way to do it.
It's all about having a "Growth Mindset", and always thinking outside the box.
Hope you like it, see you next time.
... Now it's time to distribute the emails to the clients. The Don is waiting for the envelopes.