DDoS any website with Google Spreadsheet
Google uses its FeedFetcher crawler ("spider") to cache any content inserted into a Google Spreadsheet via the =image("link") formula.
For example, if you put the formula
=image("http://example.com/image.jpg")
into one of the spreadsheet cells, Google sends a FeedFetcher spider to download the image and cache it so it can be displayed in the table. However, if you append a random parameter to the image URL, FeedFetcher will re-download it every time. Suppose, for example, the victim's website hosts a 10 MB PDF file. Pasting a list like the one below into a spreadsheet makes the Google spider download the same file 1000 times:
=image("http://targetname/file.pdf?r=1")
=image("http://targetname/file.pdf?r=2")
=image("http://targetname/file.pdf?r=3")
=image("http://targetname/file.pdf?r=4")
...
=image("http://targetname/file.pdf?r=1000")
All of this can exhaust the traffic quota of some site owners. Anyone with just a browser and a single open tab can launch a massive HTTP GET flood against any web server.
The attacker doesn't even need a fast connection. Since the formula points to a PDF file (that is, not to an image that could actually be rendered in the table), the attacker only gets N/A back from Google's server. This makes it quite easy to amplify the attack many times over [an analogue of DNS and NTP amplification - approx. translator], which poses a serious threat.
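To make the amplification explicit, here is a rough back-of-the-envelope estimate; the 10 MB file size is the example from above and the formula length is approximate.

# Rough amplification estimate: the attacker pastes a few dozen bytes of
# spreadsheet text, Google fetches the whole file from the victim, and the
# attacker only ever sees "N/A" in the cell.
formula_bytes = len('=image("http://targetname/file.pdf?r=1000")')  # ~43 bytes
file_bytes = 10 * 1024 * 1024                                        # 10 MB PDF
amplification = file_bytes / formula_bytes
print(f"roughly {amplification:,.0f}:1 victim traffic per byte of pasted formula")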
Using a single laptop with several tabs open and simply copy-pasting lists of links to a 10 MB file, I got the Google spider to download that file at over 700 Mbit/s. In my case this went on for 30-45 minutes, until I shut the server down. If I calculated everything correctly, that was about 240 GB of traffic in 45 minutes.
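A quick sanity check of these figures, assuming the 700 Mbit/s rate was sustained for the full 45 minutes:

# Sanity check of the numbers above: 700 Mbit/s sustained for 45 minutes.
rate_bits_per_s = 700e6
duration_s = 45 * 60
total_gb = rate_bits_per_s * duration_s / 8 / 1e9
print(f"about {total_gb:.0f} GB")  # ~236 GB, in line with the ~240 GB estimate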
I was surprised when I saw the amount of outgoing traffic. A little more and, I think, outgoing traffic would have reached 1 Gbit/s, with 50-100 Mbit/s incoming. I can only imagine what would happen if several attackers used this method at once. The Google spider uses more than one IP address, and although the User-Agent is always the same, editing the web server config may come too late if the attack catches the victim by surprise. Because it is so easy to launch, an attack can easily go on for hours.
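For anyone who wants early warning rather than a post-mortem, a minimal detection sketch along these lines could look like the following. The log path is hypothetical, and the "FeedFetcher-Google" User-Agent token is an assumption that should be checked against your own access logs.

# Scan a combined-format access log and count hits whose User-Agent
# contains the assumed FeedFetcher token, grouped by path with the
# query string stripped (so file.pdf?r=1 and file.pdf?r=2 count together).
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"   # hypothetical path
UA_TOKEN = "FeedFetcher-Google"          # assumed token; verify in your logs

hits_per_path = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if UA_TOKEN not in line:
            continue
        match = re.search(r'"(?:GET|HEAD) (\S+)', line)
        if match:
            hits_per_path[match.group(1).split("?")[0]] += 1

for path, count in hits_per_path.most_common(10):
    print(f"{count:6d}  {path}")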
When I came across this bug, I searched for earlier incidents involving it and found two:
• An article in which a blogger describes how he accidentally attacked himself and received a huge traffic bill [translation on Habré - approx. translator].
• Another article describes a similar attack using Google Spreadsheet, but the author first suggests scraping the links to all of the site's files and launching the attack against that list from multiple accounts.
I find it somewhat strange that nobody thought to try adding a random parameter to the request. Even if a site hosts only one file, adding a random parameter allows thousands of requests to be made to it. Honestly, it is a little scary: simply pasting a few links into an open browser tab should not be able to cause this.
Yesterday I sent a description of this bug to Google and received a reply that it is not a vulnerability and does not qualify for the Bug Bounty program. Perhaps they already knew about it and really do not consider it a bug?
Still, I hope they fix this issue. It is simply annoying that anyone can make the Google spider cause so much trouble. A simple fix would be to fetch files while ignoring extra URL parameters [in my opinion this is a poor fix; it would be better to limit bandwidth, or to cap the number of requests and the amount of traffic per unit of time, while also refusing to download files above a certain size - approx. translator].
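Purely as an illustration of what such fixes could mean on the fetcher side, here is a sketch under assumptions, not Google's actual code: the cache key is the URL with its query string dropped, and the 100 MB per-host budget is an arbitrary example standing in for a rate or traffic limit.

# Illustrative fetcher-side mitigation: cache on the canonical URL
# (query string removed) and stop re-fetching once a per-host byte
# budget is exhausted.
from urllib.parse import urlsplit, urlunsplit
from collections import defaultdict

MAX_BYTES_PER_HOST = 100 * 1024 * 1024   # assumed 100 MB budget per host

cache = {}                                # canonical URL -> cached bytes
spent = defaultdict(int)                  # host -> bytes already fetched

def canonical(url: str) -> str:
    """Drop the query string, so file.pdf?r=1 and file.pdf?r=2 collapse."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

def fetch_image(url: str, download):
    key = canonical(url)
    if key in cache:
        return cache[key]                 # served from cache, no new request
    host = urlsplit(url).netloc
    if spent[host] >= MAX_BYTES_PER_HOST:
        return None                       # budget exhausted: show N/A instead
    data = download(key)                  # `download` is a stand-in HTTP client
    spent[host] += len(data)
    cache[key] = data
    return data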