All right.
Video number two for building a use case to extract realtor information off of realtor.com or Zillow.
So let me help out with this.
All right.
So let me log in to the desktop app.
Let me get to the right workspace here to start building, and I'm going to click add automation.
I believe you're gonna want to build this as two different automations if you're trying to extract individual data for each realtor.
So let me just go to realtor.com. Now, how do I get to the realtors? Find realtors.
I wanna go here.
You could go to the agents in 97005 search.
OK.
So if there are URLs we can extract, we should be able to extract those URLs for these people.
So you probably want the individual realtor information, I'm assuming.
So you want some data in here.
OK.
Good to know.
All right.
So we're gonna wanna split this up into two different automations.
The first automation is going to be extracting the URL of each one of these realtors.
So I just need to find where that URL is.
Usually it's in the name.
So what I'm gonna do is go to scrape list, create a list, hover over the name, and because it's a list, I wanna select two names in a row.
I'm gonna pause here really quick.
OK.
So what I wanna do, since this is a repeating list here, is click scrape list, and I want to extract the URL.
So I'm hoping there is a URL here.
OK? Perfect.
There is.
So what I'm gonna do is select the URL. If this page has pagination, which it does, then I also want to capture the next button.
It's gonna be either one of these here.
So let me just do this.
I'm gonna do both. I'm gonna capture both, just in case one of them works better.
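(Side note: if you wanted to do this same URL-extraction step in code rather than in the desktop app, a rough Python sketch of the idea is below. The search URL pattern and the two candidate selectors are assumptions for illustration, not realtor.com's actual markup.)

```python
# Rough sketch: pull the profile URL out of each agent card.
# SEARCH_URL and both selectors are assumptions, not realtor.com's real markup.
import requests
from bs4 import BeautifulSoup

SEARCH_URL = "https://www.realtor.com/realestateagents/97005"  # assumed URL pattern

resp = requests.get(SEARCH_URL, headers={"User-Agent": "Mozilla/5.0"})
soup = BeautifulSoup(resp.text, "html.parser")

# Capture both candidate selectors, just in case one of them works better.
candidates = ["a.agent-name", "div.agent-card a"]
links = []
for selector in candidates:
    for a in soup.select(selector):
        href = a.get("href")
        if href and href not in links:
            links.append(href)

print(links)
```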
OK, let me explain my thinking here.
I don't want to just skip past the page, so I'm gonna delete the last click step.
I believe this first click step was going to be enough.
And I now have the skeleton of my automation for scraping realtor.com.
It's gonna go to realtor.com, click find realtors, click the zip code search (or type the search), scrape the links, and then click the next button.
Now, see here with realtor.com, I can have it navigate directly to a specific page, right? I can do this in a separate browser.
I can go to find realtors, say if I wanted to find realtors just in my area, and to clean this up,
what I'm gonna do is copy this URL and include this URL stub.
I want it to navigate directly to here rather than having to click find agents and whatnot.
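(In code, that shortcut is just building the search URL from the zip code instead of clicking through the menus. A minimal sketch, assuming the URL stub matches what the address bar shows:)

```python
# Sketch: build the agent-search URL directly from a zip code instead of
# clicking "find agents". The URL stub is an assumption based on the address
# bar, not a documented API.
BASE = "https://www.realtor.com/realestateagents"

def agent_search_url(zip_code: str) -> str:
    return f"{BASE}/{zip_code}"

print(agent_search_url("97005"))  # e.g. https://www.realtor.com/realestateagents/97005
```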
So I'm gonna delete all of these other steps.
So I hope this helps.
I could extract all of these URLs, put them into a sheet, and then connect the sheet and have it loop through each one of them.
I could do that too.
But let's keep this simple.
I'm gonna add a send data step. I need to make sure I have headers, and I'm gonna connect the sheet to this automation so it sends the data here.
Scrape test.
Found my headers.
Good to go. Step two.
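(If you were wiring the send-data step up yourself instead of using the built-in sheet connection, a gspread sketch would look roughly like this. The credentials file and sheet name are assumptions for illustration.)

```python
# Sketch: append scraped rows to a Google Sheet with gspread.
# "creds.json" and the sheet name "Realtor URLs" are placeholders.
import gspread

gc = gspread.service_account(filename="creds.json")
ws = gc.open("Realtor URLs").sheet1

# Make sure the header row exists first, since step two depends on it.
if not ws.row_values(1):
    ws.append_row(["Name", "URL"])

# Each scraped realtor becomes one row.
for name, url in [("Example Agent", "https://www.realtor.com/realestateagents/example")]:
    ws.append_row([name, url])
```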
OK, now I do want to add a scroll-down step,
because I wanna make sure I grab everybody, so it loads everybody on the page.
So what I'm gonna do is go to custom scroll down and have it scroll down for, say, four seconds, just in case.
And then this is good to go.
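(The scroll-down step is doing what a browser-automation library would do: keep scrolling for a few seconds so the lazy-loaded cards have time to appear. A Playwright sketch of the same idea, using the four-second budget from above; the URL is an assumed pattern:)

```python
# Sketch: scroll for ~4 seconds so lazy-loaded agent cards have time to render.
import time
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://www.realtor.com/realestateagents/97005")  # assumed URL pattern

    deadline = time.time() + 4        # scroll for four seconds, just in case
    while time.time() < deadline:
        page.mouse.wheel(0, 2000)     # scroll down a chunk
        page.wait_for_timeout(250)    # give newly loaded cards a moment to render

    browser.close()
```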
So I'm gonna click play. Oops. This should be pasting data into this sheet here.
This is part one.
So then once I have all of the URLs... by the way, I could have done two scrape lists: one of them could have been extracting the URL, and the second scrape list could have been scraping their name.
That way I know who that URL is connected with.
But for the purposes of this tutorial video, I kept it simple.
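(For reference, pairing the name with the URL in code just means reading both pieces from the same card element so the rows stay lined up. A small sketch, with placeholder selectors:)

```python
# Sketch: read the name and the profile URL from the same card element so each
# row stays paired. "div.agent-card" and "a.agent-name" are placeholder selectors.
from bs4 import BeautifulSoup

def extract_agents(html: str) -> list[tuple[str, str]]:
    rows = []
    soup = BeautifulSoup(html, "html.parser")
    for card in soup.select("div.agent-card"):
        link = card.select_one("a.agent-name")
        if link and link.get("href"):
            rows.append((link.get_text(strip=True), link["href"]))
    return rows
```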
Cool.
OK.
So I didn't get a chance to read what it said there.
It was probably detecting a bot.
But anyway, we do have all of the realtors here.
OK.
Now, if I wanted this to scrape multiple pages, what I will do is clear it up, click set up trigger, and then use a loop-through trigger.
I'm not gonna attach a sheet, unless you wanna attach a sheet with URLs, and I want it to loop through, say, 10 pages of extracting realtors.
I can schedule it to run automatically if I want to.
And what I want this to do is loop from steps 1 to 4, 10 times.
So that's the way you want to do that.
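(The loop-through trigger is the GUI version of repeating steps 1 through 4 in a for loop: scrape the current page, click next, and do it ten times. A hedged Playwright sketch of that flow; the selectors and URL are assumptions:)

```python
# Sketch: repeat "scrape the page, click next" ten times.
# The URL pattern and both selectors are assumptions, not realtor.com's real markup.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://www.realtor.com/realestateagents/97005")  # assumed URL pattern

    for _ in range(10):                        # loop steps 1-4 ten times
        page.wait_for_timeout(2000)            # let the current page render
        hrefs = page.eval_on_selector_all(
            "a.agent-name",                    # placeholder selector
            "els => els.map(e => e.href)",
        )
        print(hrefs)
        page.click("a[aria-label='Go to next page']")  # placeholder next button

    browser.close()
```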
This is part one of scraping the details that you need.
I'll record part two here in a sec.