Hey, Bruce, this is Vince getting back to you here.
I appreciate you sending that in.
So I can absolutely help you with this.
Um, the video was a little bit tough to follow in terms of exactly everything that you were trying to scrape.
So I just want to give some clarity on some easier ways to scrape off of LinkedIn, and on scraping in general.
And then I can offer some guidance if you don't wanna do it that way, plus some things to pay attention to.
So here's what I would recommend.
Uh, I'm gonna open up LinkedIn here, and you might want to split this into a few different automations that way.
It's just cleaner.
There's less chance for it to be stopped and have any errors.
Let me log in really quick.
OK? I'm logged in.
And then what I wanna go to is, let me actually just repeat what you were doing again.
Let me grab the filters that you were using: keywords for software developer.
Got a team down.
Yeah.
All right.
And so what I would recommend is, I mean, do what you normally do, like you just did.
You click enter here, open up the search bar for the filter to search.
OK, I went through, and then I want my keywords here.
I'm not sure if I got exactly what you had; you were in jobs, so jobs.
OK, and then United Kingdom. And extra clicks here don't matter, because what I'm gonna do is delete some of these steps and just add in a direct URL to the page that I want to scrape from.
OK.
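Just to make the direct-URL idea concrete: if you were ever doing this outside the builder, the same trick looks roughly like this as a little Playwright sketch in Python. The search path and query parameter names here are just assumptions for illustration, not LinkedIn's documented ones.

```python
# Rough sketch of the "navigate straight to the search URL" idea instead of
# clicking through the UI. The path and query parameter names are assumptions.
from urllib.parse import urlencode
from playwright.sync_api import sync_playwright

params = urlencode({"keywords": "software developer", "location": "United Kingdom"})
search_url = f"https://www.linkedin.com/search/results/companies/?{params}"

with sync_playwright() as p:
    browser = p.chromium.launch(headless=False)
    page = browser.new_page()
    page.goto(search_url)         # direct navigation: no click/search/enter steps to fail
    page.wait_for_timeout(3000)   # short delay so the results can finish rendering
```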
So then what I would do from here is capture some scrape steps.
So if you want this information, for example, this is one way to build it.
I think it would be cleaner.
Um, from the audio in your video I couldn't tell the whole outcome.
I don't know the exact outcome of what you're trying to achieve.
But if it were me, what I'd be doing is using a scrape list, and I would be grabbing the names of the companies, where I can choose either the text or the URL.
I can grab the text, the name, and the direct URL.
I'll grab the URL first, which will be in column A, and then I'm gonna grab the name, so I know which company it is, and then I'll grab the text. And then what else do you want here? Oh, we want a location.
Let me grab that; I don't need the URL or all of this.
Here we go.
We got location.
So I have the URL, the name and then the location here, and I just need the text for that one since the URL was not offered.
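If it helps to see what that scrape list is doing under the hood, here's a rough Python/Playwright sketch of the same thing: for each result card, take the link's URL, the link's text as the name, and a location element's text. The selectors are placeholders; LinkedIn's real markup will be different.

```python
# Rough equivalent of the scrape-list step: for each visible result, grab the
# company link's URL, its text (the name), and the location text only.
# Selectors are placeholders, not LinkedIn's real class names.
from playwright.sync_api import Page

def scrape_visible_results(page: Page) -> list[list[str]]:
    rows = []
    for card in page.query_selector_all("li.result-card"):   # assumed list-item selector
        link = card.query_selector("a.company-link")          # assumed company link selector
        loc = card.query_selector("span.location")            # assumed location selector
        rows.append([
            link.get_attribute("href") if link else "",  # column A: direct URL
            link.inner_text().strip() if link else "",   # column B: company name
            loc.inner_text().strip() if loc else "",     # column C: location, text only
        ])
    return rows
```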
OK, then from here, I'm not sure exactly if you want anything else, but what I would do is I would stop the automation.
Now, this would be automation number one.
And so what this will be is, I'm gonna come in here, this is gonna scrape all the direct URLs to a spreadsheet.
And then I'm gonna build a second automation to loop through each URL and scrape the data I need from each URL.
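And the split itself is simple: automation one writes the URLs out, automation two reads them back and visits each one. Here's a rough outline of that pattern; collect_company_urls and scrape_company_page are hypothetical helpers standing in for the two automations, not real functions in any tool.

```python
# Outline of the two-automation split described above. collect_company_urls()
# and scrape_company_page() are hypothetical placeholders, not real tool APIs.

def automation_one(page) -> list[str]:
    """First pass: scrape all the direct company URLs to a spreadsheet/list."""
    return collect_company_urls(page)                 # hypothetical helper

def automation_two(page, urls: list[str]) -> list[dict]:
    """Second pass: loop through each saved URL and scrape the data needed."""
    results = []
    for url in urls:
        page.goto(url)
        page.wait_for_timeout(2000)                   # let each page render before scraping
        results.append(scrape_company_page(page))     # hypothetical helper
    return results
```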
So this one: log in, sign in, search. I'm actually just going to build a delay step in here.
And then what I also want is the direct URL of that page.
So what I'm gonna do is log back in here.
Give me a sec.
OK, so I logged back in, in a separate browser.
I'm gonna repeat for jobs.
Is that not opening up a second window? There we go.
And jobs.
OK? United Kingdom.
OK.
So I have this in a separate browser; this is not recording, this is not Chromium.
I'm gonna copy this URL.
So it can be in any browser: Safari, Firefox, Chrome, doesn't matter.
And then I'm gonna add a direct navigate step to go directly to that page.
That way it's cleaner.
I don't need all these steps, the click, search, enter; these are all places where it could fail.
Now, if you do require these steps, or if you want something built in, we do have an allow-error feature.
So I would recommend clicking allow error on any step that's not required to achieve the outcome of your automation.
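In code terms, allow error just means wrapping the optional steps so a failure gets skipped instead of killing the run; a minimal sketch of that idea, with the selector being a placeholder:

```python
# Minimal sketch of the "allow error" idea: an optional step's failure is
# logged and skipped so the rest of the automation keeps running.
from playwright.sync_api import Page, Error as PlaywrightError

def try_optional_click(page: Page, selector: str) -> None:
    try:
        page.click(selector, timeout=5000)   # nice-to-have step, not required for the outcome
    except PlaywrightError as exc:
        print(f"Optional step failed, continuing anyway: {exc}")
```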
Enter.
As you can see here, there's a lot of steps here.
Jobs.
United States.
And really, I just need to get to my scrape steps.
So I have my scrape list: the URL, the name and then, I think, the location. Then I'm gonna send this to a Google Sheet.
OK? Let me just choose.
I'll just use my scrape test sheet here.
Let me edit to make sure I have three columns with headers in it.
Scrape test, there you go.
So we have the company URL, the company name and the location.
I want them all bolded.
Ok, cool.
And then I'm gonna send it; I already have it connected.
So let me just check my sheet connection.
Yeah, make sure it has my headers.
That looks good.
Step eight is link's name.
OK? Cool.
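And for reference, pushing those three columns into a Google Sheet from code would look roughly like this with gspread; the sheet name and the credentials file are assumptions for the example.

```python
# Rough sketch of sending the scraped rows to a Google Sheet with gspread.
# The sheet name and the service-account credentials file are assumptions.
import gspread

def send_to_sheet(rows: list[list[str]]) -> None:
    gc = gspread.service_account(filename="credentials.json")  # assumed credentials file
    ws = gc.open("Scrape test").sheet1                          # assumed sheet name
    if not ws.get_all_values():                                 # write headers once if empty
        ws.append_row(["Company URL", "Company Name", "Location"])
    ws.append_rows(rows)                                        # one row per company
```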
And then I'm just gonna run this and scrape all that data.
I'm gonna trigger this to run, and it should be scraping data to this page here.
I'll log into my linkedin.
We have a single sign on feature too.
So remember to use that so you don't have to log in every time, which will help reduce any errors or login prevention.
Cool.
So you can see it navigated to the page here, a little bit cleaner.
Let me see.
It's navigating through some of those scrape steps.
Ok? So it then scraped the company URL company name and location.
Now, it only grabbed the first few that were visible on this screen.
So now what I wanna do is add in a scroll step.
Let me just add a delay to let it finish rendering, and then I'm gonna add in a custom scroll step, where I can choose how many seconds I want it to scroll for.
So we'll say I can test it in a normal browser.
Oh, and we also have pagination in here too.
OK? So good to know.
We're gonna need to click through.
We're gonna need to build pagination in as well to get all the pages, if you want them.
So let me just say uh 15 seconds, I want to scroll down for 15 seconds.
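The scroll step itself is just "keep nudging the page down for N seconds so more results load"; roughly, in the same Playwright terms:

```python
# Sketch of a timed scroll: keep scrolling for a set number of seconds so
# lazily loaded results get a chance to render before scraping.
import time
from playwright.sync_api import Page

def scroll_for(page: Page, seconds: int = 15) -> None:
    end = time.time() + seconds
    while time.time() < end:
        page.mouse.wheel(0, 800)       # scroll down a chunk
        page.wait_for_timeout(500)     # brief pause so new items can load
```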
And then I'm gonna add in a step, a custom click step, because I didn't grab it, which is gonna be looping through, and I'm gonna need to attach a loop trigger.
Yeah, add it. I want to grab this button here actually.
Uh yeah, grab the XPath and selector.
So, cool.
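Clicking that next button by XPath is the one custom step here; as code it's basically a one-liner, and the XPath below is only a placeholder for whatever the capture actually grabbed.

```python
# Click the pagination "Next" button by XPath. The XPath is a placeholder;
# use whatever selector/XPath the capture tool recorded for the real button.
from playwright.sync_api import Page

def click_next(page: Page) -> None:
    page.click("xpath=//button[@aria-label='Next']")   # assumed XPath for the next-page button
    page.wait_for_timeout(2000)                         # give the next page time to load
```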
So what this is gonna do is it will log in, navigate directly to that page.
Scroll down for 15 seconds and scrape everything that I need.
Company link, company name and then company location, and then it's going to click on the next page. And then I wanna now build in a loop trigger.
I'm not gonna attach any sheets.
So I'm gonna skip that and I want this to loop through.
I have no idea how many times here.
Let's see.
You can choose how many times you want this to loop.
You know, we could say loop through 20 times, which means it's gonna go to 20 different pages.
You can see there's 17,000 results on here.
So I'm just gonna choose to loop through three times, for time purposes.
And then we can schedule this to run off the desktop.
Whenever you want. I have my loop steps and loop settings built in here.
So I don't want it to loop back and log in, so I'm gonna start the loop at step nine.
So as soon as it goes, it clicks the next page, scrolls down, grabs the links and then clicks next.
It'll do that three times.
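So the loop boils down to: log in once, then repeat scroll, scrape, next for however many pages you set. Roughly, reusing the hypothetical helpers from the earlier sketches:

```python
# The loop in plain terms: never re-log in, just repeat scroll -> scrape -> next.
# scroll_for, scrape_visible_results, send_to_sheet and click_next are the
# hypothetical helpers sketched earlier in this walkthrough.

def run_pages(page, pages: int = 3) -> None:
    for _ in range(pages):                           # loop starts after the login step
        scroll_for(page, seconds=15)                 # load everything on the current page
        send_to_sheet(scrape_visible_results(page))  # company URL, name, location
        click_next(page)                             # move to the next page of results
```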
So I'm gonna run this really quick.
Um, now there is a bug with our scroll-down step, which is being worked on right now.
So by the time you get this video and try this out, it should be fixed and updated, but for now it's likely not gonna scroll down, and we need it to scroll down in order to grab all of the companies off that page.
And if there are any errors, we can work from there. I'm thinking it's gonna error at the click step, which is the click-next-page step.
Um, and I can play with the selectors and the XPaths, but it might be better just to recapture steps.
Yeah, it errored at step 13, as I thought it would.
So in that sense, here's what I would do.
I would just take this recapture steps here and navigate directly to that page that you need.
Or actually, let's go to LinkedIn first and log in, so it captures the login.
OK, I am in LinkedIn and I'm just gonna navigate directly to that page, copy the URL, and then grab my scrape steps.
So we're looking at the scrape list.
It's pretty much repeating this.
So I'm gonna click URL, scrape list; click text, scrape list; and location, which will be text only.
So I have all those lined up here, then I'm gonna have it scroll down and I want to capture this next button.
Ok.
Good.
To go here.
All right.
So uh it's gonna log in and I'm going to add in a delay step.
What's important is to add in a direct-navigate, direct-URL step to navigate directly to that page.
And then another delay to allow it to finish rendering.
And then I'm gonna add in a scroll down, which I'm confident right now is not gonna work, but it will here soon.
We're updating it right now. Let's say 15 seconds, and then it's gonna scrape all this data.
It's gonna click, and I want to send to Sheets, which should already be connected.
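And tying it together, that rebuilt order, log in, delay, navigate direct, delay, scroll, scrape, click next, send to Sheets, is the same skeleton as before; a compact sketch reusing the hypothetical helpers above, with login left out since that depends on how your session is stored.

```python
# The rebuilt flow end to end, reusing the hypothetical helpers sketched above.
# Login handling is omitted; the search URL below is a placeholder.
from playwright.sync_api import sync_playwright

SEARCH_URL = "https://www.linkedin.com/..."   # the copied direct URL (placeholder)

with sync_playwright() as p:
    browser = p.chromium.launch(headless=False)
    page = browser.new_page()
    page.goto(SEARCH_URL)           # direct-navigate step
    page.wait_for_timeout(3000)     # delay to finish rendering
    run_pages(page, pages=3)        # scroll, scrape, send to Sheets, click next, three times
```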
Awesome.
And I'm gonna set up my trigger, trigger should also be connected.
So I, all I did was recapture steps.
OK? Cool.
And let's test this out.
Keep in mind I'm not confident it's gonna scroll down.
This should at least capture the next page, or at least deal with the pagination on the page now. There we go.
So it did capture.
So it's gonna loop through three times.
Um, so remember it's only gonna grab what it's seeing here, and I can see it's not scrolling.
So that's the one thing that we do need to update, which we will update as soon as possible here.
Yeah, and it's good.
I'm just gonna loop through three times.
All right, cool.
So we got more data.
Um, but we did not grab all the data off of each page, because the scroll-down isn't working right now.
So that is automation number one.
So I'm gonna stop the video now and then I'll show you automation number two