SEO Conspiracy News #15

This Week in the Digital Marketing World

  1. A thread on Reddit asks: "How does Google find a page without backlinks?"
  2. Optimyzr shares a script to see the impact of the reduced search terms report for Google Ads.
  3. Is Facebook really trying to fight fake news?
  4. Microsoft shows us what an underwater datacenter looks like.
  5. Twitch signs a partnership with Sacem, the French music copyright organization.
  6. Watch Google propaganda at play about the effort to save the Earth.
  7. The Piratebay.org domain sold for $50,000.
  8. A study shares an aggressive SEO tactic.
  9. Another study documents a journey into improving Google Ads spending.
  10. Googlebot will crawl over HTTP/2.
  11. Sistrix tested Core Web Vitals for the most popular content management systems (WordPress, Wix, Squarespace, Joomla, etc.).

On Reddit: How does Google find a page without any backlinks?
Very good question, and many answers.
Google has very creative ways of finding a new page on the web. I've tested so many things; for example, just sharing a link in Gmail and then finding the page indexed on Google. I've run many experiments like that since I started in 2004.
Testing how Google finds a page is one of my hobbies, but I won't draw general conclusions from it; these are just tests I've done on a personal level.
This channel is called SEO Conspiracy for a good reason: I don't want myths to spread, and that has been going on for too long.
Furthermore, this question might have many answers, but at the end of the day, does it really matter?
First, if you know how to build links by applying the topical mesh, my internal linking strategy (you can find it in the Semantic SEO playlist), it will work and you will have no problem indexing new pages.
Second, if you don't fall for Google's psychological warfare tactics trying to make you believe that backlinks are evil, if you're not scared of backlinks, then backlinks are the number one asset you have for getting pages discovered.
A few weeks ago I reported on indexing services; it was on BlackHatWorld, and apparently one indexing service still works if you do heavy spam, like trying to index hundreds of thousands of pages a day, built on the principle of the landing page.
Yes, in that case you might need some help indexing. But for the rest of us, if you know how to build a strong site from a PageRank standpoint, you don't. Let me remind you that Domain Authority does not exist; PageRank exists. Citation Flow on Majestic is PageRank, and Trust Flow is topical PageRank; that's all you need. You need strong assets: at least 40 and 40 on the Majestic metrics, which corresponds to a PageRank 4 and, in my opinion, is enough, along with all the on-page and on-site assets. And finally, you only see the first phase: Google discovers a page. Google discovers 25 billion spam pages a day, so then Google has to decide:
okay, I discovered a page, do I index it, yes or no? If it does index, there are a lot of operations: they have to calculate PageRank, anti-spam, relevancy, legitimacy, authority, all of this. In the end, we only see the first phase and the last phase: Google discovers a page; Google returns that page on the search engine results page. Everything in between is invisible to us, so at least check your log files. First phase: when does Google discover your page? Last phase: when does Google return the page on the search engine results page? And at the end of the day, it won't help you rank: whatever you do to get indexed faster doesn't mean you're going to rank higher. If you apply a good, sound strategy like the topical mesh, you will be discovered and indexed very fast, and you will rank.
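The "check your log files" advice can be sketched in a few lines: scan your access log for the first Googlebot request to a URL, which pins down the discovery phase. This is a minimal illustration, assuming a combined-format access log; the sample lines are made up, so adapt the regex to your server.

```python
import re
from datetime import datetime

# Minimal sketch: find the first Googlebot request for a URL in a
# combined-format access log, i.e. roughly when Google discovered the page.
# The log format here is an assumption; adjust the regex to your server.
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def first_googlebot_hit(log_lines, path):
    """Return the timestamp of the first Googlebot request for `path`, or None."""
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        if m.group("path") == path and "Googlebot" in m.group("ua"):
            # Apache/nginx timestamp, e.g. 10/Sep/2020:06:25:11 +0000
            return datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
    return None

# Illustrative log lines, not real traffic:
sample = [
    '66.249.66.1 - - [10/Sep/2020:06:25:11 +0000] "GET /new-page HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Sep/2020:06:26:02 +0000] "GET /new-page HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(first_googlebot_hit(sample, "/new-page"))  # first phase: discovery time
```

You can run the same idea against the last phase by checking Search Console or your rankings; the log file only tells you about discovery and crawling.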

Optimyzr shares a script to see the impact of the reduced search terms report for Google Ads.

On SearchEngineLand: How much does Google’s new search term filtering affect ad spend transparency? Here’s how to find out.

A couple of weeks ago I reported that Google is starting to filter out some keywords in the Google Ads search terms report.
It's a problem, and here the folks from Optimyzr shared a quick script, with filters for the console, so you can see the difference. What they found is that impressions were already heavily filtered before Google's announcement on September 2, 2020.
There was also a pronounced decrease in visibility into the queries driving clicks and cost. Smart Shopping campaigns saw less of a change, because most search terms were already unavailable for them anyway.
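The Optimyzr script itself isn't reproduced here, but the idea behind measuring the filtering can be sketched: compare account-level totals against the sum of what the search terms report still shows; the gap is the spend you have lost visibility into. The numbers and field names below are made up for illustration.

```python
# Minimal sketch of the idea behind the filtering report: compare account-level
# totals with what the search terms report actually shows. The gap is the share
# of spend/impressions Google no longer attributes to a visible query.
# All figures and field names are hypothetical.

def hidden_share(account_total, visible_rows, field):
    """Fraction of `field` (e.g. 'cost') not covered by visible search terms."""
    visible = sum(row[field] for row in visible_rows)
    return (account_total - visible) / account_total

search_terms = [
    {"query": "buy red shoes", "impressions": 1200, "cost": 34.0},
    {"query": "red shoes sale", "impressions": 800, "cost": 21.0},
]

# Account totals from the campaign report (these include filtered-out terms).
print(f"hidden impressions: {hidden_share(5000, search_terms, 'impressions'):.0%}")
print(f"hidden cost:        {hidden_share(80.0, search_terms, 'cost'):.0%}")
```

Running the same calculation on data from before and after September 2, 2020 is what lets you separate the new filtering from the filtering that was already there.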

Is Facebook really trying to fight against fake news?

On Techcrunch: Leaked memo excoriates Facebook’s ‘slapdash and haphazard’ response to global political manipulation

A leaked memo tells us that Facebook's response to the global political manipulation issue is mostly PR. The memo reads: "It's an open secret within the civic integrity space that Facebook's short-term decisions are largely motivated by PR and the potential for negative attention. It's why I've seen priorities of escalations shoot up when others start threatening to go to the press, and why I was informed by a leader in my organization that my civic work was not impactful, under the rationale that if the problems were meaningful they would have attracted attention, become a press fire, and convinced the company to devote more attention to the space." Facebook responded by saying it was mainly about fake likes and that they were facing more urgent and harmful threats globally. Of course it's all about PR; I mean, come on, they don't want 2016 to happen again.
With all the accusations, and especially after the big tech hearing (Apple, Amazon, Facebook, Google; mainly Google and Facebook, and even Twitter, which was not in the hearing), they have to show good faith: they have to make an effort to prove that they don't manipulate political bias, meaning that each side has an equal opportunity to be filtered out or not from whatever the users are seeing. It's hard, and they won't be able to be 100% effective. Fake news will still spread, and some political bias can still be found. It's technology; it's not perfect, and you have bugs. And even at the top, do they really want to change things or not? It's an open question.

Microsoft shows us what an underwater datacenter looks like.

Microsoft finds underwater datacenters are reliable, practical and use energy sustainably

If you've never seen what an underwater data center looks like, go to news.microsoft.com. It's just a fairly big cylinder, and supposedly they will extend this type of deployment. Going underwater solves many problems, especially in regards to cooling, but it will definitely not solve the increasing need for data centers, resources, electricity and so on. We'll talk about that in a couple of seconds when we come to Google.

Twitch signs a partnership with Sacem, the French music copyright organization.

Twitch has signed a partnership with Sacem.
Sacem is the official French organization that pays singers, authors and musicians royalties for their work.
What's important here, if you follow the Twitch drama around DMCA reports, is that this is a big move towards making sure it's not the Wild West anymore on Twitch as far as copyright goes. Because it's not fair: on YouTube or Facebook you can't stream anything you want, yet for the past, well, how old is Twitch, eight or nine years, you were able to stream whatever you wanted, whether music, video or TV, it didn't matter. I see a lot of moves around music, so yes, let's regulate that, because copyright is something you need to respect. Don't use other people's work without respecting their copyrights.

Watch Google propaganda at play about the effort to save the Earth.

Going back to this whole save-the-planet theme: Google is announcing its third decade of climate action, "our most ambitious yet". This is Google propaganda at work. Don't believe the hype. They say that today they achieve 100% renewable energy; it's a lie. They buy renewable energy certificates, so they consume electricity like anybody else, but then they give back money that is invested in renewable energy projects. And they claim that by 2030 they will be the first major company to operate carbon-free. Good luck with that. Until then, last time I checked, only two of their data centers were mostly carbon-free, using wind turbines. Also, the way they invest in sustainability bonds is pretty controversial: they invest in technologies that are not so sustainable. They don't count whatever is going on with mobile either, even though mobile usage has a huge carbon footprint, and they don't count what it costs to build those wind turbines and solar panels and recycle them; that's out of the equation. So let's not play pretend here; it's just a PR move. Let's remind ourselves that demand is increasing: there will be more data centers, we require more power, we require more compute, we want more bandwidth. That doesn't sound good, that's all I'm saying.

The Piratebay.org domain sold for $50,000.

Piratebay.org Sold for $50,000 at Auction, ThePiratebay.com Up Next

Piratebay.org was sold for fifty thousand dollars, and ThePiratebay.com is next. By the time you watch this news, ThePiratebay.com will most likely also have been sold. Fifty thousand dollars is a nice price for Piratebay.org. Now, what do you do with it? I don't know. All I know is that The Pirate Bay is using the .fun extension right now if you want to find torrents.

A study shares an aggressive SEO tactic.

[Case Study] Ranking a parasite for a high KD & high traffic keyword by reverse engineering Google algorithm

Very ambitious title, because in my opinion this guy did not reverse engineer anything. He just applied traditional tactics: optimizing content, looking at user intent and giving Google what it wants. That's what I do, and what I've been doing for at least 10 years working on semantic SEO. So good job, but in my opinion and from my experience, it's nothing special or spectacular, because that's all I've been doing, and I think you should too.

Another study documents a journey into improving Google Ads spending.

My Google Ads adventure. Buckle up!

This study is pretty cool because this guy thoroughly documented his journey into ad spending. It didn't go so well at the beginning; then, spoiler alert, towards the end he started to improve, but it's nothing impressive.

Googlebot will crawl over HTTP/2

Googlebot will soon speak HTTP/2

Finally! I've been waiting for this, because HTTP/2 is what we should be using, especially if you already serve over HTTPS. So thank you, Google.
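Once Googlebot switches, the change will be visible in your access logs: hits from Googlebot will show HTTP/2.0 in the request line. Here is a minimal sketch that counts Googlebot requests per protocol version, assuming a combined-style log format with the request line in quotes; the sample lines are illustrative.

```python
from collections import Counter

# Sketch: once Googlebot starts crawling over HTTP/2, your access logs will
# show "HTTP/2.0" in the request line for those hits. Count Googlebot requests
# per protocol version to watch the switch happen.
def googlebot_protocols(log_lines):
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        try:
            # Request line looks like: "GET /path HTTP/2.0"
            request = line.split('"')[1]
            counts[request.split()[-1]] += 1
        except IndexError:
            continue  # malformed line, skip it
    return counts

# Illustrative log lines, not real traffic:
sample = [
    '66.249.66.1 - - [. . .] "GET /a HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [. . .] "GET /b HTTP/2.0" 200 512 "-" "Googlebot/2.1"',
    '203.0.113.9 - - [. . .] "GET /b HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_protocols(sample))
```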

Sistrix tested Core Web Vitals for the most popular CMSs

Core Web Vitals – Wix vs. WordPress, Shopify vs. Shopware – What’s fastest?

Wix versus WordPress versus Shopify versus Shopware versus the rest of them all: WordPress is very bad, Wix very bad, Squarespace very bad, Joomla very bad. They're all even worse than Weebly or Drupal. The best seem to be Jimdo, Typo3, MODX and Pimcore, and Spip is actually doing pretty well. And yet we're all using WordPress, and I don't know why WordPress works, actually; it's unbelievable. I use it myself, but I don't know why it's working. It's a mystery; magic works on the web.
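For reference, benchmarks like this measure against Google's published "good" thresholds for each Core Web Vitals metric: LCP at or under 2.5 seconds, FID at or under 100 ms, and CLS at or under 0.1. A minimal pass/fail sketch (the measurement values are illustrative):

```python
# Google's published "good" thresholds for Core Web Vitals:
#   LCP (Largest Contentful Paint)  <= 2.5 s
#   FID (First Input Delay)         <= 100 ms
#   CLS (Cumulative Layout Shift)   <= 0.1
GOOD = {"lcp_s": 2.5, "fid_ms": 100, "cls": 0.1}

def passes_core_web_vitals(lcp_s, fid_ms, cls):
    """True only if all three metrics are in the 'good' range."""
    return (lcp_s <= GOOD["lcp_s"]
            and fid_ms <= GOOD["fid_ms"]
            and cls <= GOOD["cls"])

# Hypothetical field data for two sites:
print(passes_core_web_vitals(lcp_s=2.1, fid_ms=80, cls=0.05))  # True
print(passes_core_web_vitals(lcp_s=4.2, fid_ms=120, cls=0.3))  # False
```

A single slow metric is enough to fail, which is why so many CMS-based sites in the Sistrix data come out badly overall.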

Listen to the podcast

Watch the video
