Gints
@gintsg
560 posts

SEO expert with 15+ years of experience. Thrives on the most complex SEO strategies and concepts. Passionate about guitar, metalcore, and trail running.

New Zealand · Joined April 2009
271 Following · 197 Followers
Gints @gintsg
@gofishchris Been using this for many years now. Great extension! However, nowadays it's pretty common for Google to do the rendering within roughly 3 days of crawling, if needed. At the end of the day, the key is how the DOM is rendered, unless JS/CSS files are blocked.
Chris Long @chris_nectiv
Technical SEO Tip: The Web Developer extension allows you to easily see what content on a page requires JavaScript in order to load. This is one of my favorite extensions when auditing sites for JavaScript dependencies.

The Web Developer extension has a setting that allows you to "Disable JavaScript". If selected, it will load the page without executing JavaScript, allowing you to see what the page looks like. You can then analyze which elements of a given page require JavaScript in order to load properly.

In this screenshot, we can see that Steve Madden's entire category page requires JavaScript in order to render the content. Since this is a high-priority page, we'd definitely want to check to ensure that search engines are able to properly crawl and index the content.

If you're doing a lot of technical SEO, I'd highly recommend downloading the Web Developer extension. This one feature alone is extremely powerful for diagnosing potential JavaScript indexing issues.
[attached screenshot]
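The check the extension performs can be roughly approximated in a few lines: fetch the raw HTML, which (like a browser with JavaScript disabled) executes no scripts, and test whether the content you care about is already present. A minimal sketch, not a full audit tool; the sample HTML and phrases below are invented for illustration:

```python
# Rough JS-dependency check: content present in the raw HTML is
# visible even with JavaScript disabled; content missing from it
# is likely injected client-side and needs rendering to be indexed.
import urllib.request

def phrase_in_html(html, phrase):
    """True if the phrase is served in the initial HTML."""
    return phrase in html

def fetch_raw_html(url):
    """Fetch a page without executing any JavaScript (plain HTTP GET)."""
    req = urllib.request.Request(url, headers={"User-Agent": "audit-sketch"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Invented examples: server-rendered content vs. an empty JS shell.
server_rendered = "<main><h1>Chelsea Boots</h1><div>42 products</div></main>"
js_shell = '<div id="root"></div><script src="/app.js"></script>'

print(phrase_in_html(server_rendered, "Chelsea Boots"))  # True
print(phrase_in_html(js_shell, "Chelsea Boots"))         # False
```

If the phrase only appears after rendering, indexing depends on Google's render queue rather than the initial crawl.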
Gints @gintsg
@TheJackForge I think LinkedIn, with or without this, can be great for headhunters if your account is in a solid state. In recent years I've been getting some solid job opportunities from headhunters without this specific badge, but I would expect an extra one or two solid offers if using it.
Darth Autocrat (Lyndon NA)
@inkovic @gaganghotra_ CSS images were always a pain. The parser looks for specific things in HTML. CSS background images don't fit that pattern, so they never used to get crawled/indexed (even if embedded or inlined). I don't think that ever changed?
Gagan Ghotra @gaganghotra_
(NEW) Google updates page about image SEO best practices 🙂 "Google can find images in src attribute of <img> element (even when it's a child of other elements, such as the <picture> element). Google doesn't index CSS images."
[attached screenshot]
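A toy illustration of the distinction Google documents here: a parser that collects `<img src>` values (the pattern Google says it indexes) finds markup images, while an image referenced only via a CSS `background-image` never appears as an `<img>` element at all. The HTML snippet is invented:

```python
# Collect <img src> values the way an HTML-focused parser would.
# The CSS background-image URL below is never seen by this pass,
# mirroring "Google doesn't index CSS images". Sample HTML is invented.
from html.parser import HTMLParser

class ImgSrcCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs
        if tag == "img":
            self.sources.extend(v for k, v in attrs if k == "src")

html = """
<picture><img src="/images/product.jpg" alt="Product"></picture>
<div style="background-image: url('/images/hero-banner.jpg')"></div>
"""

collector = ImgSrcCollector()
collector.feed(html)
print(collector.sources)  # ['/images/product.jpg']; the CSS image is invisible here
```

Note the `<img>` inside `<picture>` is still found, matching the quoted documentation.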
Gints @gintsg
@lilyraynyc That's like the old days, with the 72-hour delay.
Gints @gintsg
@DavidGQuaid It's more about how it's represented by 3rd-party tools. I don't deny its existence, but it works much differently now. 3rd-party tools focus too much on quantity and on fictional DR numbers, without the actual quality and relevance of each link.
David Quaid - AI SEO @DavidGQuaid
You don't agree that PageRank exists, or you believe you can make up Authority without backlinks? I mean, the view is that PageRank = external authority, a ranking factor. On-site SEO just maps authority to the rest of the site. So creating 1,000 internal links instead of 100 somehow increases Authority; is that the disagreement?
David Quaid - AI SEO @DavidGQuaid
We have to hold SEO "Visionaries" accountable. "SEOs" make up theories all the time and Google gets blamed. Yesterday, a PR person claimed that Google is trying to act like a human. There is no evidence for this. How can a machine exhibit a lack of bias by becoming biased? What biases would it pick? You need bias to be human.

Google documents EVERYTHING. It's typical of EVERY engineering-led company. Google CAN be reverse engineered. It has been reverse engineered: PageRank was published as a patent. PageRank, not bias, not research, not LLMs, is the foundation for Search. Microsoft's Bing knows this. Yandex knows it. DuckDuckGo knows it; they are all implementations of Google's PageRank. SEMrush knows it; Ahrefs and Moz all know it: they reverse engineered PageRank too, to get DA, SA, and Keyword Difficulty. Google's SEO Starter Guide confirms it.

Where is the evidence for bias architecture? There is none. There is hyperbole. We have to hold SEO "Visionaries" accountable.
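For context, the PageRank algorithm this thread keeps returning to can be sketched as a simple power iteration over a link graph. This is a minimal illustration of the published idea, not Google's production system; the toy link graph is invented, and the damping factor of 0.85 is the value from the original PageRank paper:

```python
# Minimal PageRank via power iteration over a toy link graph.
# Not Google's implementation; graph and parameters are illustrative.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
# Ranks sum to ~1.0; "home" scores highest because every page links to it.
```

The point the iteration makes concrete: a page's score comes from the scores of pages linking to it, which is why external links (not sheer internal link counts) drive the "authority" being argued about.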
Gints @gintsg
I would assume it's just for users; it's fairly common that people search for login pages, as silly as it sounds. Since you have a lot of internal links on every page, Google just indexes it, and probably sometimes uses it for sitelinks. It's a silly Google issue that I have seen for at least 4+ years now. It can happen with other pages that have a lot of internal links and are blocked by robots.txt, which is not a good method to use in these cases.
Simone Semprini @simo_sempr
@gintsg @rustybrick @methode Yes, exactly. But what's the point of indexing those pages? There are important pages with long content that aren't even indexed, while thousands of login pages with redirects are indexed.
Lily Ray 😏 @lilyraynyc
My teammate @SavannaLGray noticed this highlighting in meta descriptions and warned me I might get “not newed” if I shared it. But I think it’s new so I’m taking my chances. @rustybrick? 🏆🏆🏆
[attached screenshot]
Gints @gintsg
@jonoalderson There are many new SEOs who read old best practices and think that they have found the holy grail. It's not just about old practitioners.
Jono Alderson @jonoalderson
So many of the early SEO practitioners are now completely useless. No understanding of how websites work. No understanding of how businesses work. No critical thinking. Just a bag of tricks that no longer works. Yet, still, they seem to cling on, selling bad consultancy. Sad.
Gints @gintsg
@simo_sempr @rustybrick @methode They do not crawl it; they just index it. A login page is almost always indexed if blocked by robots.txt. I would assume it's a login page with a redirect parameter that generates thousands of versions?
Simone Semprini @simo_sempr
@gintsg @rustybrick @methode In Search Console we have thousands of URLs reporting "Indexed, though blocked by robots.txt". Most are inaccessible login pages. But the Google crawler keeps requesting them.
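The pattern under discussion comes down to one distinction: robots.txt controls crawling, not indexing, so a blocked login page that collects many internal links can still show up as "Indexed, though blocked by robots.txt". A hedged sketch of the two approaches (the `/login` path is an invented example):

```
# robots.txt: prevents crawling of /login, but the URL itself may
# still be indexed (without content) if it is heavily linked.
User-agent: *
Disallow: /login

# To keep the page out of the index instead, allow crawling and
# serve a noindex directive on the page or in the response headers:
#   <meta name="robots" content="noindex">
#   X-Robots-Tag: noindex
# (noindex only works if the crawler is allowed to fetch the page;
# combining Disallow with noindex means the directive is never seen.)
```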
Gints @gintsg
@karbofoni @lilyraynyc Exactly my thoughts; user signals are important, and it seems like they are impacting rankings globally.
Karbofoni @karbofoni
@lilyraynyc Seems that user signals are not geo-related/filtered anymore (most internet users are in India, I guess)
Lily Ray 😏 @lilyraynyc
Our team is inadvertently finding more and more examples of Indian sites/content ranking in top positions in U.S. search results. I thought this might have just been an AIO thing, but it seems to be pretty rampant in the organic search results right now too. Like this Weather.com featured snippet result from their Indian site section (in English). Has it always been this way and I'm just noticing it more? Or did something change???
[attached screenshot]
Gints @gintsg
@lilyraynyc Additionally, if user signals are important, India is one of the largest countries and user signals could start impacting results globally as well.
Gints @gintsg
@lilyraynyc Just a thought: wouldn't it be related to user diversity across the globe, where user signals from each location impact rankings? Every country has different people who search, who are part of the overall pool of user signals, and who may prefer results from their home country.
Gints @gintsg
@jonoalderson IMHO it's partially true; there is room for common issues, but prioritisation must depend on the situation. Sometimes multiple issues can be combined on a per-project basis. The best audit focuses on solutions instead of listing 100 issues.
Jono Alderson @jonoalderson
A technical SEO audit done via a template isn't worth the metaphorical paper it's printed on.
Gints @gintsg
@gaganghotra_ These are some good points! This is one of the reasons why I always try to do some qualitative SERP analysis if the allocated time allows. You can always squeeze something out of SERP analysis that no tool will tell you.
Gagan Ghotra @gaganghotra_
Most search features for generic short-tail queries end up showing women's products. This is quite common in Australian search results, and it's not that Google is biased or something; it's just that Australian women spend far more shopping online than men, as different research studies have highlighted.

This demographic difference MUST be considered while building an SEO strategy: when I work for a women-only clothing brand, my content and search-feature optimisation recommendations are different from what they are when I'm working for a men-only clothing brand. Frankly, it's far easier to optimise and rank a women-only brand page for a short-tail, generic query than a men-only brand page. That's true for Australian search results; in other markets it can be different.

An example query below: for "buy pants", the "Popular Products" search feature shows women's products only.
[attached screenshot]
Gints @gintsg
@thatkatieberry Thank you! This actually triggered some thoughts, and I started looking into this. Found a Chrome extension that does it automatically. None of my clients have ads, so somehow I haven't thought about this much. Thanks!
Katherine Argent @effthealgorithm
@gintsg My ad company does it for me. Manually it would be a challenge but you could use your CMS’s mobile preview to figure out how many screen lengths it is on the average screen size for your site’s traffic, then space ads accordingly.
Katherine Argent @effthealgorithm
Site owners with ads: if it feels like you should be making more based on the number of daily visits, check two things:

1. Make sure your font size and line spacing aren't ridiculously small. Today's standard is 16-20px, depending on the font. Text-heavy site? Go for 18-20, or those of us who are older will use the Reader View in our browser, which means we never see your ads. (And by "older" I mean anyone over 40.)

2. Do not listen to your ad company on recommended density! Tell them what YOU are comfortable with on YOUR site. Their job is to sell ads. Period. Your job is to make readers happy. These things don't always align.

For example, the 30% recommended density is supposed to mean ads don't take up more than 30% of a viewport. Yet most ad companies will tell you it means 30% of the page, and by "the page" they mean from the title tag to the bottom of the comments, even though they're only putting ads between the 2nd or 3rd paragraph and the end of your post. That's how you get penalized for excessive ads while being told you're within the Better Ads Standards recommendations. But even before that penalty, you'll be losing readers because it's just. too. much.
Gints @gintsg
@BrockbankJames @searchliaison Great questions! At the end of the day, Google does reward some originality; if you are the same as 100 other similar pages in the SERPs, the only way you can outrank them is to be more popular, with links, traffic, mentions, etc. By answering your suggested questions, site owners can get there!
Gints retweeted
James Brockbank @BrockbankJames
I’ve been thinking about how this comment from @searchliaison could be interpreted in a way that gets site owners thinking about why their sites got hit by the HCU… "Everyone should focus on doing whatever they think is best for their readers."