[Internet Policy] - Restricting the series of tubes

Posts

  • Phyphor (Building Planet Busters Tasting Fruit) Registered User regular
    edited January 2014
    Re: giving google systems priority

    Searches for
    email: in order: yahoo, gmail, hotmail, news links, wikipedia
    maps: google maps, news links, yahoo maps, mapquest, bing maps
    books: amazon, B&N, google books, news links
    news: google news, fox news, cnn, yahoo news, news links, cnet, local news site
    search engine: wikipedia, duckduckgo, yahoo, ixquick, bing, dogpile

    Sure, the google thing is close to the top, but not always and everything else I'd expect to see is also there
    khain wrote: »
    Everyone is missing the point of Hedgie's example. The problem is that users cannot find the site via Google. The solution everyone is suggesting - switching to another search engine or accessing the site directly - may work for some users, but the site itself is screwed in most cases, because traffic from search engines, and specifically Google, is vital and cannot effectively be replaced.

    The complaint is that Google de-listed a site (which really was probably not de-listed, just not first page) because they attempted to exploit the ranking algorithm. The very ranking algorithm which made Google win vs the half a dozen other search engines back when it launched because it filtered out the spam results better. What is your solution here? Either google can manage its rankings via pruning and re-ranking, or it cannot and we get spam back in the results everywhere

    That particular site may be legitimate, but the method it was using to get ranked higher is the same one less legitimate sites use, and it was fixed when they stopped doing that. Seems quite reasonable to me

  • Roz (Boss of Internet) Registered User regular
    khain wrote: »
    Everyone is missing the point of Hedgie's example. The problem is that users cannot find the site via Google. The solution everyone is suggesting - switching to another search engine or accessing the site directly - may work for some users, but the site itself is screwed in most cases, because traffic from search engines, and specifically Google, is vital and cannot effectively be replaced.

    So your proposal would be to regulate Google like a monopoly even though it is not one? And this would benefit Net Neutrality how?

  • AngelHedgie Registered User regular
    Phyphor wrote: »
    Re: giving google systems priority

    Searches for
    email: in order: yahoo, gmail, hotmail, news links, wikipedia
    maps: google maps, news links, yahoo maps, mapquest, bing maps
    books: amazon, B&N, google books, news links
    news: google news, fox news, cnn, yahoo news, news links, cnet, local news site
    search engine: wikipedia, duckduckgo, yahoo, ixquick, bing, dogpile

    Sure, the google thing is close to the top, but not always and everything else I'd expect to see is also there
    khain wrote: »
    Everyone is missing the point of Hedgie's example. The problem is that users cannot find the site via Google. The solution everyone is suggesting - switching to another search engine or accessing the site directly - may work for some users, but the site itself is screwed in most cases, because traffic from search engines, and specifically Google, is vital and cannot effectively be replaced.

    The complaint is that Google de-listed a site (which really was probably not de-listed, just not first page) because they attempted to exploit the ranking algorithm. The very ranking algorithm which made Google win vs the half a dozen other search engines back when it launched because it filtered out the spam results better. What is your solution here? Either google can manage its rankings via pruning and re-ranking, or it cannot and we get spam back in the results everywhere

    That particular site may be legitimate, but the method it was using to get ranked higher is the same one less legitimate sites use, and it was fixed when they stopped doing that. Seems quite reasonable to me

    The decision was justifiable. This time.

    What if Google manipulates results so that the positions they support have preference over the ones they oppose? Would that be okay?

    What if a small startup has a technology that Google wants, so they cripple search returns to weaken them and make them vulnerable to a buyout?

    The core argument for regulating ISPs (which I agree with, by the way) is that if they control the playing field, they can cripple the small guys. My point is that they're not the only ones.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Phyphor (Building Planet Busters Tasting Fruit) Registered User regular
    Those things are nothing like the rap genius thing! Should they manipulate listings to their advantage arbitrarily? No. But that's not even close to what happened in this case so I'm not sure why you're worrying about it so much

  • shryke (Member of the Beast) Registered User regular
    Roz wrote: »
    Richy wrote: »
    What I'm understanding from the admittedly brief information presented in the thread is that Rap Genius was abusing Google, by committing some kind of PageRank shenanigans to boost their results, so Google punished them by manually removing them from the results (presumably after warning them to stop and being ignored?) and once Rap Genius straightened up and flew right they got put back in. Is that about right?

    Honestly, that's not evil or scary or anything. In fact it sounds like a pretty fair system to me.

    But more importantly in the context of this thread, it's very much different from what Verizon wants to do.

    Verizon is not planning to go after sites that exploit some kind of flaw in their network. They are planning to go after all sites for money. They plan on selling preferred traffic treatment to sites that pay them or otherwise earn them money, and curbing traffic to sites that don't pay as much. The only "abuse" they are going after is the "abuse" of not giving them enough money.

    Verizon is not eliminating Net Neutrality because they want to punish abuse. They want to become the abusers.

    Google already does that to a degree with sponsored links. There's also been points raised about them giving their own systems preference in search. So the difference isn't nearly as wide as you think.

    As for the infrastructure vs. service argument, I'd say that Google has long since passed from the latter to the former. Network effects point out why the "use another search engine" argument really doesn't work - after all, when it comes to search engines, who's left? At best, you have Bing, which remains viable mainly because Microsoft wants its own network service stack.

    The fact is that if Google delists you, you will effectively cease to exist to users. No matter how defensible the Rap Genius decision is, it doesn't change the fact that Google has that power now.

    Except you don't effectively cease to exist. You still exist just fine. The public's ability to reach you remains completely unimpaired. They simply can't search for you using one company's search engine - every other search engine will work just fine. Direct navigation will work just fine.

    If this were in the physical realm, this wouldn't be much different than a very popular guide book to a city removing your diner from their list of top diners to visit. There are other guide books - they may not be as popular - but your diner didn't stop existing. People's access to your diner wasn't impeded. But the end result is the same - less traffic to your diner (website).

    Yes, but if we don't ignore reality, we have to acknowledge that people won't switch search engines and, as the stats linked above show, your site will basically lose almost all its traffic.

    The reality is that Google can effectively wipe your website from the internet if they so choose. Whether that worries you is a whole other question, but they have the capability.

  • Dunadan019 Registered User regular
    Phyphor wrote: »
    Re: giving google systems priority

    Searches for
    email: in order: yahoo, gmail, hotmail, news links, wikipedia
    maps: google maps, news links, yahoo maps, mapquest, bing maps
    books: amazon, B&N, google books, news links
    news: google news, fox news, cnn, yahoo news, news links, cnet, local news site
    search engine: wikipedia, duckduckgo, yahoo, ixquick, bing, dogpile

    Sure, the google thing is close to the top, but not always and everything else I'd expect to see is also there
    khain wrote: »
    Everyone is missing the point of Hedgie's example. The problem is that users cannot find the site via Google. The solution everyone is suggesting - switching to another search engine or accessing the site directly - may work for some users, but the site itself is screwed in most cases, because traffic from search engines, and specifically Google, is vital and cannot effectively be replaced.

    The complaint is that Google de-listed a site (which really was probably not de-listed, just not first page) because they attempted to exploit the ranking algorithm. The very ranking algorithm which made Google win vs the half a dozen other search engines back when it launched because it filtered out the spam results better. What is your solution here? Either google can manage its rankings via pruning and re-ranking, or it cannot and we get spam back in the results everywhere

    That particular site may be legitimate, but the method it was using to get ranked higher is the same one less legitimate sites use, and it was fixed when they stopped doing that. Seems quite reasonable to me

    The decision was justifiable. This time.

    What if Google manipulates results so that the positions they support have preference over the ones they oppose? Would that be okay?

    What if a small startup has a technology that Google wants, so they cripple search returns to weaken them and make them vulnerable to a buyout?

    The core argument for regulating ISPs (which I agree with, by the way) is that if they control the playing field, they can cripple the small guys. My point is that they're not the only ones.

    1) corporations are made up of people, not villains in capes.
    2) it would not be worth the backlash risk if word got out that that was what they did.

  • Mill Registered User regular
    Hedgie's example may not be the best, but it does sort of illustrate why losing net neutrality is a bad thing: the ISPs could easily cripple anyone for not giving them enough money (fucking over the little guys) or just for not liking them (again fucking over the little guys, who won't have the resources to fight such bullshit). I'm still mystified that there doesn't seem to be any motivation to treat broadband access the way we treated phone and electric access several decades ago (aka "fuck your profits, you'll run that utility out").

  • AngelHedgie Registered User regular
    Phyphor wrote: »
    Those things are nothing like the rap genius thing! Should they manipulate listings to their advantage arbitrarily? No. But that's not even close to what happened in this case so I'm not sure why you're worrying about it so much

    Because your entire argument for why I shouldn't worry about it ultimately boils down to "Trust Google."

    Just because they had justification for doing so doesn't erase the fact that by modifying PageRank, they can make a website effectively vanish - a result indicated by the traffic report.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Roz (Boss of Internet) Registered User regular
    edited January 2014
    shryke wrote: »
    Roz wrote: »
    Richy wrote: »
    What I'm understanding from the admittedly brief information presented in the thread is that Rap Genius was abusing Google, by committing some kind of PageRank shenanigans to boost their results, so Google punished them by manually removing them from the results (presumably after warning them to stop and being ignored?) and once Rap Genius straightened up and flew right they got put back in. Is that about right?

    Honestly, that's not evil or scary or anything. In fact it sounds like a pretty fair system to me.

    But more importantly in the context of this thread, it's very much different from what Verizon wants to do.

    Verizon is not planning to go after sites that exploit some kind of flaw in their network. They are planning to go after all sites for money. They plan on selling preferred traffic treatment to sites that pay them or otherwise earn them money, and curbing traffic to sites that don't pay as much. The only "abuse" they are going after is the "abuse" of not giving them enough money.

    Verizon is not eliminating Net Neutrality because they want to punish abuse. They want to become the abusers.

    Google already does that to a degree with sponsored links. There's also been points raised about them giving their own systems preference in search. So the difference isn't nearly as wide as you think.

    As for the infrastructure vs. service argument, I'd say that Google has long since passed from the latter to the former. Network effects point out why the "use another search engine" argument really doesn't work - after all, when it comes to search engines, who's left? At best, you have Bing, which remains viable mainly because Microsoft wants its own network service stack.

    The fact is that if Google delists you, you will effectively cease to exist to users. No matter how defensible the Rap Genius decision is, it doesn't change the fact that Google has that power now.

    Except you don't effectively cease to exist. You still exist just fine. The public's ability to reach you remains completely unimpaired. They simply can't search for you using one company's search engine - every other search engine will work just fine. Direct navigation will work just fine.

    If this were in the physical realm, this wouldn't be much different than a very popular guide book to a city removing your diner from their list of top diners to visit. There are other guide books - they may not be as popular - but your diner didn't stop existing. People's access to your diner wasn't impeded. But the end result is the same - less traffic to your diner (website).

    Yes, but if we don't ignore reality, we have to acknowledge that people won't switch search engines and, as the stats linked above show, your site will basically lose almost all its traffic.

    The reality is that Google can effectively wipe your website from the internet if they so choose. Whether that worries you is a whole other question, but they have the capability.

    I think then you need to expand both your and AngelHedgie's arguments beyond targeting Google directly.

    I think what you guys are arguing is that Search Engines are the parties that are capable of wiping websites off the map, and with that much power, Search Engines act as a core part of the internet infrastructure (alongside ISPs, Registries).

    If you expand your argument a bit to regulate Search Engines collectively, that would seem reasonable to me.

  • Phyphor (Building Planet Busters Tasting Fruit) Registered User regular
    When that site exists only to provide lyrics lookup, then yes getting listed lower on the biggest search engine hurts, if only because that is the kind of thing that can be handled by dozens of other sites. People didn't want to "go to rap genius" they wanted to look up some lyrics.

    Besides, I've yet to hear your solution, just more doomsaying. I mean yeah, at some point you have to trust your search engine to provide you with the relevant results. Sorry, but your options are
    1) use an existing search engine
    2) use non-search sites to link crawl your way through the web
    3) operate your own web crawler
    4) not find things

  • AngelHedgie Registered User regular
    Dunadan019 wrote: »
    Phyphor wrote: »
    Re: giving google systems priority

    Searches for
    email: in order: yahoo, gmail, hotmail, news links, wikipedia
    maps: google maps, news links, yahoo maps, mapquest, bing maps
    books: amazon, B&N, google books, news links
    news: google news, fox news, cnn, yahoo news, news links, cnet, local news site
    search engine: wikipedia, duckduckgo, yahoo, ixquick, bing, dogpile

    Sure, the google thing is close to the top, but not always and everything else I'd expect to see is also there
    khain wrote: »
    Everyone is missing the point of Hedgie's example. The problem is that users cannot find the site via Google. The solution everyone is suggesting - switching to another search engine or accessing the site directly - may work for some users, but the site itself is screwed in most cases, because traffic from search engines, and specifically Google, is vital and cannot effectively be replaced.

    The complaint is that Google de-listed a site (which really was probably not de-listed, just not first page) because they attempted to exploit the ranking algorithm. The very ranking algorithm which made Google win vs the half a dozen other search engines back when it launched because it filtered out the spam results better. What is your solution here? Either google can manage its rankings via pruning and re-ranking, or it cannot and we get spam back in the results everywhere

    That particular site may be legitimate, but the method it was using to get ranked higher is the same one less legitimate sites use, and it was fixed when they stopped doing that. Seems quite reasonable to me

    The decision was justifiable. This time.

    What if Google manipulates results so that the positions they support have preference over the ones they oppose? Would that be okay?

    What if a small startup has a technology that Google wants, so they cripple search returns to weaken them and make them vulnerable to a buyout?

    The core argument for regulating ISPs (which I agree with, by the way) is that if they control the playing field, they can cripple the small guys. My point is that they're not the only ones.

    1) corporations are made up of people, not villains in capes.
    2) it would not be worth the backlash risk if word got out that that was what they did.

    1. We're talking about a company that paid the US government half a billion dollars to forestall a criminal investigation that was very likely to involve their top management.

    2. How would it get out? Google keeps PageRank very close to their vest. We know about Rap Genius because they told us, and they told us because that suited their plans.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Phyphor (Building Planet Busters Tasting Fruit) Registered User regular
    Employees would probably leak it if they were doing it in those cases. Who do you think actually implements these things? The CEO himself?

  • AngelHedgie Registered User regular
    Roz wrote: »
    shryke wrote: »
    Roz wrote: »
    Richy wrote: »
    What I'm understanding from the admittedly brief information presented in the thread is that Rap Genius was abusing Google, by committing some kind of PageRank shenanigans to boost their results, so Google punished them by manually removing them from the results (presumably after warning them to stop and being ignored?) and once Rap Genius straightened up and flew right they got put back in. Is that about right?

    Honestly, that's not evil or scary or anything. In fact it sounds like a pretty fair system to me.

    But more importantly in the context of this thread, it's very much different from what Verizon wants to do.

    Verizon is not planning to go after sites that exploit some kind of flaw in their network. They are planning to go after all sites for money. They plan on selling preferred traffic treatment to sites that pay them or otherwise earn them money, and curbing traffic to sites that don't pay as much. The only "abuse" they are going after is the "abuse" of not giving them enough money.

    Verizon is not eliminating Net Neutrality because they want to punish abuse. They want to become the abusers.

    Google already does that to a degree with sponsored links. There's also been points raised about them giving their own systems preference in search. So the difference isn't nearly as wide as you think.

    As for the infrastructure vs. service argument, I'd say that Google has long since passed from the latter to the former. Network effects point out why the "use another search engine" argument really doesn't work - after all, when it comes to search engines, who's left? At best, you have Bing, which remains viable mainly because Microsoft wants its own network service stack.

    The fact is that if Google delists you, you will effectively cease to exist to users. No matter how defensible the Rap Genius decision is, it doesn't change the fact that Google has that power now.

    Except you don't effectively cease to exist. You still exist just fine. The public's ability to reach you remains completely unimpaired. They simply can't search for you using one company's search engine - every other search engine will work just fine. Direct navigation will work just fine.

    If this were in the physical realm, this wouldn't be much different than a very popular guide book to a city removing your diner from their list of top diners to visit. There are other guide books - they may not be as popular - but your diner didn't stop existing. People's access to your diner wasn't impeded. But the end result is the same - less traffic to your diner (website).

    Yes, but if we don't ignore reality, we have to acknowledge that people won't switch search engines and, as the stats linked above show, your site will basically lose almost all its traffic.

    The reality is that Google can effectively wipe your website from the internet if they so choose. Whether that worries you is a whole other question, but they have the capability.

    I think then you need to expand both your and AngelHedgie's arguments beyond targeting Google directly.

    I think what you guys are arguing is that Search Engines are the parties that are capable of wiping websites off the map, and with that much power, Search Engines act as a core part of the internet infrastructure (alongside ISPs, Registries).

    If you expand your argument a bit to regulate Search Engines collectively, that would seem reasonable to me.

    The thing is that Google is the search engine of record. I would not mind making neutrality regulations applicable to all search engines across the board, but the issue is most applicable to Google, because they have demonstrated that they possess this capacity.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • AngelHedgie Registered User regular
    Phyphor wrote: »
    When that site exists only to provide lyrics lookup, then yes getting listed lower on the biggest search engine hurts, if only because that is the kind of thing that can be handled by dozens of other sites. People didn't want to "go to rap genius" they wanted to look up some lyrics.

    Besides, I've yet to hear your solution, just more doomsaying. I mean yeah, at some point you have to trust your search engine to provide you with the relevant results. Sorry, but your options are
    1) use an existing search engine
    2) use non-search sites to link crawl your way through the web
    3) operate your own web crawler
    4) not find things

    My solution is simple - search engines are infrastructure, not services, and need to be handled accordingly.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Phyphor (Building Planet Busters Tasting Fruit) Registered User regular
    edited January 2014
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't give all of the things; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to algorithmically detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

  • shryke (Member of the Beast) Registered User regular
    Phyphor wrote: »
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't give all of the things; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to easily detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

    All of them. Why not?

    How do you regulate it? The same way you do any of these things. You are allowed to discriminate/rank based on certain criteria and not others and the government keeps an eye on you to make sure you do that.

  • Phyphor (Building Planet Busters Tasting Fruit) Registered User regular
    Well, what if someone wanted to make a "conservative search engine" that blacklisted liberal websites? Shouldn't that be allowed?

    Choosing the allowable criteria would be tricky at best; get it wrong and you risk search engines becoming completely unusable due to spam, with the companies that manage them unable to fix it. Well, except for one based in not-America that can filter

    I just find it funny that in the NSA threads Hedgie is very skeptical about possible abuse of spying capabilities without any hard proof, but here he is very queasy about what Google can potentially do, also with no proof that it is actually being abused

  • DevoutlyApathetic Registered User regular
    shryke wrote: »
    Phyphor wrote: »
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't give all of the things; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to easily detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

    All of them. Why not?

    How do you regulate it? The same way you do any of these things. You are allowed to discriminate/rank based on certain criteria and not others and the government keeps an eye on you to make sure you do that.

    Discrimination/ranking is the heart of search engine tech. That is the entirety of their value add that you're trying to hand-wave away. This sounds like a fantastic way to make all search engines shitty and incapable of improving because of regulation.

    Nod. Get treat. PSN: Quippish
  • Phyphor (Building Planet Busters Tasting Fruit) Registered User regular
    edited January 2014
    Just remember what searching was like in the 90s before all the ranking/discrimination stuff was done well: check the first 3-5 pages of all of the biggest 4-5 engines and you might find what you want. How many people go past page 2 of just google these days?

  • shryke (Member of the Beast) Registered User regular
    shryke wrote: »
    Phyphor wrote: »
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't give all of the things; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to easily detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

    All of them. Why not?

    How do you regulate it? The same way you do any of these things. You are allowed to discriminate/rank based on certain criteria and not others and the government keeps an eye on you to make sure you do that.

    Discrimination/ranking is the heart of search engine tech. That is the entirety of their value add that you're trying to hand-wave away. This sounds like a fantastic way to make all search engines shitty and incapable of improving because of regulation.

    Which is why I said "certain criteria". You know, the thing that exists in the sentence you didn't read to actually give it its full meaning.

  • AngelHedgie Registered User regular
    Phyphor wrote: »
    Just remember what searching was like in the 90s before all the ranking/discrimination stuff was done well: check the first 3-5 pages of all of the biggest 4-5 engines and you might find what you want. How many people go past page 2 of just google these days?

    That's the point.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • shryke (Member of the Beast) Registered User regular
    Phyphor wrote: »
    Well, what if someone wanted to make a "conservative search engine" that blacklisted liberal websites? Shouldn't that be allowed?

    Choosing the allowable criteria would be tricky at best; get it wrong and you risk search engines becoming completely unusable due to spam, with the companies that manage them unable to fix it. Well, except for one based in not-America that can filter

    All depends how you want to structure the rules. I can easily see you being allowed to run a search engine for specific types of searches. But still within other regulatory criteria.

    I just find it funny that in the NSA threads Hedgie is very skeptical about possible abuse of spying capabilities without any hard proof, but here he is very queasy about what Google can potentially do, also with no proof that it is actually being abused

    Except for the demonstration of exactly what it's capable of.

  • Phyphor (Building Planet Busters Tasting Fruit) Registered User regular
    Phyphor wrote: »
    Just remember what searching was like in the 90s before all the ranking/discrimination stuff was done well: check the first 3-5 pages of all of the biggest 4-5 engines and you might find what you want. How many people go past page 2 of just google these days?

    That's the point.

    ... that people find what they want easily now? That searching actually works?

    Yeah, yeah I know that your point here is that that's all people use now, but meh
    shryke wrote: »
    shryke wrote: »
    Phyphor wrote: »
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't return everything; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing Wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

    All of them. Why not?

    How do you regulate it? The same way you do any of these things. You are allowed to discriminate/rank based on certain criteria and not others and the government keeps an eye on you to make sure you do that.

    Discrimination/rank basing is the heart of search engine tech. That is the entirety of their value add that you're trying to hand-wave away. This sounds like a fantastic way to make all search engines shitty and incapable of improving because of regulation.

    Which is why I said "certain criteria". You know, the thing that exists in the sentence you didn't read to actually give it its full meaning.

    Handwaving. We can't refute "certain criteria" because it's so vague as to be meaningless. Maybe there is a set of criteria that can ensure current search quality while easing your fears; I don't know! Do you? Otherwise, those words don't mean anything.

  • shrykeshryke Member of the Beast Registered User regular
    Phyphor wrote: »
    shryke wrote: »
    shryke wrote: »
    Phyphor wrote: »
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't return everything; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing Wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

    All of them. Why not?

    How do you regulate it? The same way you do any of these things. You are allowed to discriminate/rank based on certain criteria and not others and the government keeps an eye on you to make sure you do that.

    Discrimination/rank basing is the heart of search engine tech. That is the entirety of their value add that you're trying to hand-wave away. This sounds like a fantastic way to make all search engines shitty and incapable of improving because of regulation.

    Which is why I said "certain criteria". You know, the thing that exists in the sentence you didn't read to actually give it its full meaning.

    Handwaving. We can't refute "certain criteria" because it's so vague as to be meaningless. Maybe there is a set of criteria that can ensure current search quality while easing your fears; I don't know! Do you? Otherwise, those words don't mean anything.

    Actually it means a lot. "Certain criteria" is how most regulatory regimes work. Stop being deliberately stupid. And especially don't act, as DevoutlyApathetic did, like I didn't say "certain criteria" in the first place.

    You are allowed to fire people based on certain criteria and not others. That's not handwavy or unworkable, it's the fucking law and it works just fine. Same with things like "what kind of behaviour you are allowed to do in the stock market" (see - insider trading)

    You can do the same shit for search engines. "You are not allowed to rank pages based on the following criteria: <list>" Fucking done.

    If you can't think of what those criteria are, you aren't thinking or paying any fucking attention. I'll start you off with an easy one: "based on who owns the company running the website".

  • PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    It means nothing without a criteria list. I'm not the one who is proposing this scheme, nor am I the one worried that some random site isn't going to show up for some arbitrary reason. Sure you could come up with some random list of things: thou shalt not discriminate based on the background colour! But would that address your fears?

  • PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    Let's put this differently. Do you have any problems with ranking a site based on its contents, and the links to that site around the web?
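    Roughly, that's the link-based half of the original PageRank idea. A minimal sketch, assuming the classic power-iteration approach; the tiny link graph and domain names here are invented for illustration, not anything a real engine runs:

    ```python
    # Minimal link-based ranking sketch (power iteration, PageRank-style).
    # The link graph and domains below are made up for illustration.

    def rank_pages(links, damping=0.85, iterations=50):
        """Rank pages by who links to them: each page passes a share of its
        score to the pages it links to, repeated until the scores settle."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            rank = new_rank
        return sorted(pages, key=rank.get, reverse=True)

    links = {
        "a.example": ["b.example"],               # a links to b
        "b.example": ["a.example", "c.example"],  # b links to a and c
        "c.example": ["a.example"],               # c links to a
    }
    print(rank_pages(links))  # a.example ranks first: most inbound links
    ```

    The point of the sketch: "content plus inbound links" is a mechanical scoring rule, and the whole SEO-spam arms race is about gaming exactly this kind of rule.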

  • Salvation122Salvation122 Registered User regular
    Roz wrote: »
    In that case though Rap Genius was perpetrating page-ranking shenanigans, and hand-correcting the algorithm is about the only tool in Google's box to do something about that.

    Edit: And at any rate, Google pretty much used that power effectively to get RG's attention.

    And that should make me feel comfortable...why, exactly? A lot of the defense of Google's actions boils down to:

    1. PageRank is their baby to do with as they please, and
    2. Trust Google (even though they refuse to disclose any details of how they make these evaluations).

    I'm sorry, but I'm not comfortable with that.

    And what is your solution? Google should be forced to not be able to manage their own site/results?

    They own PageRank, it's their site. If someone is abusing their algorithm, should they not have recourse against that party?

    And even then, it's not as if they specifically removed the site from the internet (removed its DNS entries or blocked resolution at the IP level). They essentially stopped printing it in their phone book; there are still other phone books out there.

    And what is your solution? Verizon should not be allowed to manage their own network?

    My solution would be "nationalize the network," as it's critical national infrastructure.

    That's not terribly likely to happen though.

  • DevoutlyApatheticDevoutlyApathetic Registered User regular
    edited January 2014
    shryke wrote: »
    Phyphor wrote: »
    shryke wrote: »
    shryke wrote: »
    Phyphor wrote: »
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't return everything; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing Wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

    All of them. Why not?

    How do you regulate it? The same way you do any of these things. You are allowed to discriminate/rank based on certain criteria and not others and the government keeps an eye on you to make sure you do that.

    Discrimination/rank basing is the heart of search engine tech. That is the entirety of their value add that you're trying to hand-wave away. This sounds like a fantastic way to make all search engines shitty and incapable of improving because of regulation.

    Which is why I said "certain criteria". You know, the thing that exists in the sentence you didn't read to actually give it its full meaning.

    Handwaving. We can't refute "certain criteria" because it's so vague as to be meaningless. Maybe there is a set of criteria that can ensure current search quality while easing your fears; I don't know! Do you? Otherwise, those words don't mean anything.

    Actually it means a lot. "Certain criteria" is how most regulatory regimes work. Stop being deliberately stupid. And especially don't act, as DevoutlyApathetic did, like I didn't say "certain criteria" in the first place.

    You are allowed to fire people based on certain criteria and not others. That's not handwavy or unworkable, it's the fucking law and it works just fine. Same with things like "what kind of behaviour you are allowed to do in the stock market" (see - insider trading)

    You can do the same shit for search engines. "You are not allowed to rank pages based on the following criteria: <list>" Fucking done.

    If you can't think of what those criteria are, you aren't thinking or paying any fucking attention. I'll start you off with an easy one: "based on who owns the company running the website".

    So when I type in Barack Obama it can't give preference to his own website? McDonald's website doesn't get preference when somebody types in McNuggets nutritional value? In both of those cases the search algorithm is correctly discerning what I actually want.

    It sounds like you might want to think about these burdens you're imposing to remedy a problem that hasn't been shown to exist. "Certain criteria" is a weasel phrase right around the level of "I'd like a pony" in its relevance to actually doing things.

    DevoutlyApathetic on
    Nod. Get treat. PSN: Quippish
  • Apothe0sisApothe0sis Have you ever questioned the nature of your reality? Registered User regular
    How can you treat search engines as infrastructure of the internet?

    The Internet is global, search engines can be hosted anywhere.

  • JuliusJulius Captain of Serenity on my shipRegistered User regular
    Phyphor wrote: »
    Just remember what searching was like in the 90s before all the ranking/discrimination stuff was done well: check the first 3-5 pages of all of the biggest 4-5 engines and you might find what you want. How many people go past page 2 of just google these days?

    That's the point.

    The point is that there isn't a real problem?

  • AngelHedgieAngelHedgie Registered User regular
    shryke wrote: »
    Phyphor wrote: »
    shryke wrote: »
    shryke wrote: »
    Phyphor wrote: »
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't return everything; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing Wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

    All of them. Why not?

    How do you regulate it? The same way you do any of these things. You are allowed to discriminate/rank based on certain criteria and not others and the government keeps an eye on you to make sure you do that.

    Discrimination/rank basing is the heart of search engine tech. That is the entirety of their value add that you're trying to hand-wave away. This sounds like a fantastic way to make all search engines shitty and incapable of improving because of regulation.

    Which is why I said "certain criteria". You know, the thing that exists in the sentence you didn't read to actually give it its full meaning.

    Handwaving. We can't refute "certain criteria" because it's so vague as to be meaningless. Maybe there is a set of criteria that can ensure current search quality while easing your fears; I don't know! Do you? Otherwise, those words don't mean anything.

    Actually it means a lot. "Certain criteria" is how most regulatory regimes work. Stop being deliberately stupid. And especially don't act, as DevoutlyApathetic did, like I didn't say "certain criteria" in the first place.

    You are allowed to fire people based on certain criteria and not others. That's not handwavy or unworkable, it's the fucking law and it works just fine. Same with things like "what kind of behaviour you are allowed to do in the stock market" (see - insider trading)

    You can do the same shit for search engines. "You are not allowed to rank pages based on the following criteria: <list>" Fucking done.

    If you can't think of what those criteria are, you aren't thinking or paying any fucking attention. I'll start you off with an easy one: "based on who owns the company running the website".

    So when I type in Barack Obama it can't give preference to his own website? McDonald's website doesn't get preference when somebody types in McNuggets nutritional value? In both of those cases the search algorithm is correctly discerning what I actually want.

    It sounds like you might want to think about these burdens you're imposing to remedy a problem that hasn't been shown to exist. "Certain criteria" is a weasel phrase right around the level of "I'd like a pony" in its relevance to actually doing things.

    So, let's use a real world example that shows the actual problem - say I search for abortion clinics nearby, and the search returns a list dominated by crisis pregnancy centers instead?

    I don't think the algorithm is returning what I want in that case. And without seeing it, there's no way to know if it's a garbage in, garbage out issue or if the algorithm is biased.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Apothe0sisApothe0sis Have you ever questioned the nature of your reality? Registered User regular
    So what you're saying is that you do not support the efforts to get pirate sites to have their search rank reduced?

    Because that is obviously what people want.

  • AngelHedgieAngelHedgie Registered User regular
    Apothe0sis wrote: »
    How can you treat search engines as infrastructure of the internet?

    The Internet is global, search engines can be hosted anywhere.

    And that means that search engines aren't infrastructure...why?
    Apothe0sis wrote: »
    So what you're saying is that you do not support the efforts to get pirate sites to have their search rank reduced?

    Because that is obviously what people want.

    Thank you for proving my point - the push to remove pirate sites is, in fact, a form of regulation of search engines.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • iTunesIsEviliTunesIsEvil Cornfield? Cornfield.Registered User regular
    I don't feel that Google is obligated in any way, shape, or form to show me the results that I think are most relevant. That's what I want from Google. That's what I expect of Google. That is what I believe will make Google the most money, and attract the most users to it. But I do not feel that they owe me that.

    Same with Apple Maps (who I believe was the one suspected of returning odd results like the crisis-counseling centers instead of abortion clinics). They don't return the results I want? Crap. I guess I'd better use another product. It is not Apple's duty to give me the data I want. It's Apple's duty to make money while not breaking the law.

    Silly question: is Rand McNally regulated by the government in order to verify that the information in their road-atlas is correct? What about whoeverthehell owns The Yellow Pages now? Should they be regulated? Does this seem like a fair comparison*?

    I really do not see Google's search-algorithm (or Bing's or Yahoo's or Lycos or whoeverthefuck) as a part of the infrastructure, I think that's going to be my disconnect in this discussion.

    * It seems fair to me, hence my asking. :P

  • Apothe0sisApothe0sis Have you ever questioned the nature of your reality? Registered User regular
    Apothe0sis wrote: »
    How can you treat search engines as infrastructure of the internet?

    The Internet is global, search engines can be hosted anywhere.

    And that means that search engines aren't infrastructure...why?
    Apothe0sis wrote: »
    So what you're saying is that you do not support the efforts to get pirate sites to have their search rank reduced?

    Because that is obviously what people want.

    Thank you for proving my point - the push to remove pirate sites is, in fact, a form of regulation of search engines.

    It means they can't be regulated by the US government, and arguments about "they should be required to do X" become very hard to comprehend.

    That didn't prove your point - there's no regulation, but what you're arguing for is the distortion of search results so they don't reflect what is most relevant to the interests of those searching. If I'm searching for "download Metallica free torrent" and iTunes gets returned at the top of the list and thepiratebay at the bottom, then I've just been sent to the pregnancy crisis centre.

  • PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    ... I'm not sure what your point is, but I don't think he just proved it. If anything he's saying that people expect pirate sites to show up, so clearly they still should under the regulation too.

    Without access to the full algorithm (a trade secret), the dataset (massive, and also full of trade secrets), and the "smart" data that matches things like misspellings and acronyms, there's no way to actually know. Even if the basic algorithm itself were published (beyond the initial paper), what you'd really need is code access, database access, and some experts to review it. Which will never happen.

    A lot of these things would be done as part of a data-driven system; it's not going to rewrite "abortion providers" into "pregnancy crisis centers" in the code or the algorithm. Instead there's going to be a database containing (abortion + providers) -> "pregnancy crisis centers", and an incoming query will be parsed and sent to this, which will do the lookup and rewrite it accordingly (PA -> penny arcade is one possible example). So the algorithm would be "based on a database of known misspellings, abbreviations and common searches, rewrite the query to contain terms that are more likely to have a better match", and the code would tell you nothing other than that some query is being matched and then substituted. You would really need live access into the internal databases.
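    A minimal sketch of the kind of table-driven rewriting described above; the table entries are invented examples, not anything from a real engine's data:

    ```python
    # Sketch of table-driven query rewriting: the code just does a lookup,
    # so reading it tells you nothing about WHAT actually gets rewritten.
    # These table entries are invented examples.
    REWRITES = {
        "pa": "penny arcade",
        "recieve": "receive",
    }

    def rewrite_query(query):
        """Replace known misspellings/abbreviations term by term,
        before the rewritten query is handed off to ranking."""
        terms = query.lower().split()
        return " ".join(REWRITES.get(term, term) for term in terms)

    print(rewrite_query("PA forums"))      # penny arcade forums
    print(rewrite_query("recieve email"))  # receive email
    ```

    The substitution behaviour lives entirely in the data, which is why code review alone wouldn't settle the abortion-clinic question.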

    Besides, if anything I would expect a Republican US government to be far more likely to regulate abortion provider searches away than Google would.

  • DevoutlyApatheticDevoutlyApathetic Registered User regular
    shryke wrote: »
    Phyphor wrote: »
    shryke wrote: »
    shryke wrote: »
    Phyphor wrote: »
    This may shock you, but all search engines possess this capability, or at least the ability to completely eliminate a result, if they choose to do so

    I'm not sure how you can regulate "neutrality" for search.

    Network neutrality is simple - deliver the packet, regardless of destination - but search is very complicated because you can't return everything; the very concept of searching involves ranking and filtering. How can you be "neutral" to a site that spams results? Is listing Wikipedia at or near the top "neutral"? How should you handle cases of abuse and exploitation of the system that are hard to detect?

    What would your regulations be?

    Also, are all search engines infrastructure? Or only the big ones?

    All of them. Why not?

    How do you regulate it? The same way you do any of these things. You are allowed to discriminate/rank based on certain criteria and not others and the government keeps an eye on you to make sure you do that.

    Discrimination/rank basing is the heart of search engine tech. That is the entirety of their value add that you're trying to hand-wave away. This sounds like a fantastic way to make all search engines shitty and incapable of improving because of regulation.

    Which is why I said "certain criteria". You know, the thing that exists in the sentence you didn't read to actually give it its full meaning.

    Handwaving. We can't refute "certain criteria" because it's so vague as to be meaningless. Maybe there is a set of criteria that can ensure current search quality while easing your fears; I don't know! Do you? Otherwise, those words don't mean anything.

    Actually it means a lot. "Certain criteria" is how most regulatory regimes work. Stop being deliberately stupid. And especially don't act, as DevoutlyApathetic did, like I didn't say "certain criteria" in the first place.

    You are allowed to fire people based on certain criteria and not others. That's not handwavy or unworkable, it's the fucking law and it works just fine. Same with things like "what kind of behaviour you are allowed to do in the stock market" (see - insider trading)

    You can do the same shit for search engines. "You are not allowed to rank pages based on the following criteria: <list>" Fucking done.

    If you can't think of what those criteria are, you aren't thinking or paying any fucking attention. I'll start you off with an easy one: "based on who owns the company running the website".

    So when I type in Barack Obama it can't give preference to his own website? McDonald's website doesn't get preference when somebody types in McNuggets nutritional value? In both of those cases the search algorithm is correctly discerning what I actually want.

    It sounds like you might want to think about these burdens you're imposing to remedy a problem that hasn't been shown to exist. "Certain criteria" is a weasel phrase right around the level of "I'd like a pony" in its relevance to actually doing things.

    So, let's use a real world example that shows the actual problem - say I search for abortion clinics nearby, and the search returns a list dominated by crisis pregnancy centers instead?

    I don't think the algorithm is returning what I want in that case. And without seeing it, there's no way to know if it's a garbage in, garbage out issue or if the algorithm is biased.

    First off, your inherent bias is showing in the assumption that people who search for abortion providers don't also often display an interest in crisis pregnancy centers. Regardless, no scurrilous motives need to be attributed here; this behavior could be driven by data on those overlapping interests. Second, that's when you find a new search engine, tell your friends about this shit, take to Twitter, and so on and so forth. The product clearly failed to deliver and you should seek a new product from the vast array of other choices.

    Finally, if I had to lay a wager on whether government control or corporate control would make your scenario more likely, I would pretty much always say that it'd be the government pushing towards no abortion centers showing up. Can you honestly say otherwise?

    Nod. Get treat. PSN: Quippish
  • SummaryJudgmentSummaryJudgment Grab the hottest iron you can find, stride in the Tower’s front door Registered User regular
    edited January 2014
    Phyphor wrote: »
    It means nothing without a criteria list. I'm not the one who is proposing this scheme, nor am I the one worried that some random site isn't going to show up for some arbitrary reason. Sure you could come up with some random list of things: thou shalt not discriminate based on the background colour! But would that address your fears?

    In the surveillance thread, everyone on the anti-NSA side got shit on because "you can't just come in here and say these things and attack the status quo without proof!"

    In this particular horse-race, apparently the rules are different.

    "Google is stepping way out of bounds by moderating abuse of their search results! Those search results are now infrastructure and should be protected from any kind of Google-side moderation, because people rely so heavily on those search results...because they're cleaned and moderated. Oh my God, they even re-listed the site on their index after they removed the abusive SEO content. Those bastards!"

    EDIT:
    Mill wrote: »
    Hedgie's example may not be the best, but it does sort of illustrate why it's a bad thing to not have net neutrality because the ISPs could easily cripple anyone for either not giving them enough money (fucking over the little guys) or just not liking them (again fucking over the little guys, who won't have the resources to fight such bullshit). I'm still mystified that there doesn't seem to be any motivation to treat broadband access like how we treated phone and electric access several decades ago (aka "fuck your profits, you'll run that utility out").

    This is a great explanation. The Google example is a particularly bad way to illustrate it because, for starters, Google isn't an ISP. (The Software & Search division, at least, before someone clever mentions Kansas City. That's a second bridge we're crossing before we've gotten there.)

    SummaryJudgment on
  • shrykeshryke Member of the Beast Registered User regular
    edited January 2014
    Phyphor wrote: »
    It means nothing without a criteria list. I'm not the one who is proposing this scheme, nor am I the one worried that some random site isn't going to show up for some arbitrary reason. Sure you could come up with some random list of things: thou shalt not discriminate based on the background colour! But would that address your fears?

    Sorry, I assumed you were being serious and thus the criteria list would be self-evident.

    I mean, we are talking about net neutrality, so "Thou shalt not rank sites based on their connection to your multi-national company" and "Thou shalt not rank sites based on protection payments" would be obvious. I'll try to dumb it down in the future.

    So when I type in Barack Obama it can't give preference to his own website? McDonald's website doesn't get preference when somebody types in McNuggets nutritional value? In both of those cases the search algorithm is correctly discerning what I actually want.

    It sounds like you might want to think about these burdens you're imposing to remedy a problem that hasn't been shown to exist. "Certain criteria" is a weasel phrase right around the level of "I'd like a pony" in its relevance to actually doing things.

    What the hell? Barack Obama owns a search engine now?

    I'm saying "If you use Bing and search for 'Playstation', the search engine shouldn't return Xbox links instead because MS set their search engine up in such a way as to push their own company's links to the top of the results".

    Exactly the same way net neutrality says "Comcast isn't allowed to cut down bandwidth for websites not owned by them in favour of ones that are owned by them".

    Search Engines are a vital component of the internet's infrastructure to the same extent as the lines that carry the information. They can kill traffic to your website as easily as an ISP can. And since net neutrality is all about keeping that kind of shit from happening, regulation of search engines is a rather obvious extension of the philosophy, to the point where it doesn't make sense to not see it as a goal.

    shryke on
  • PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    Oh, so things that aren't happening, unless you count ads or the links to everything at the top of the page. Got it.
