A new security flaw was reported yesterday. It affects a Google service, more precisely the web page removal request tool (“the removal of websites tool”).
It is a simple listing of an unprotected directory and its whole tree. The risk is low, but the directory can contain information that is useful to a black hat.
In my opinion, unless there is a flaw in the server itself this will not be very interesting to an attacker; flaws in websites and web services alone rarely let you do much.
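To illustrate this class of exposure, here is a minimal Java sketch that fetches a directory URL and reports whether the response looks like an auto-generated index. The target URL is a placeholder and the “Index of” string is only the signature of default Apache-style listings; both are my assumptions, not details from the report.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch: request a directory URL and look for the signature of an
// auto-generated index page. The URL below is a placeholder, not the real path,
// and the "Index of" heuristic only matches default Apache-style listings.
public class DirListingCheck {
    public static void main(String[] args) throws Exception {
        String target = args.length > 0 ? args[0]
                : "http://services.example.com/urlconsole/";  // placeholder

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(target)).GET().build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        boolean looksLikeListing = response.statusCode() == 200
                && response.body().contains("Index of");

        System.out.println(target + " -> " + (looksLikeListing
                ? "directory listing appears to be enabled"
                : "no obvious directory listing"));
    }
}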
Here is an example of a file found on their server, named config.txt:
# Properties file for urlremover application
# Copyright 2000 and onwards, Google, Inc.
# Maintained by sanjeev@google
#
# Note: development settings are maintained in config.txt, remember to
# update that file in conjunction with this one.
# General App settings
# Front door name (needed for embedding urls in emails)
frontDoor=http://services.google.com:8882/urlconsole/controller
# How long we wait before timing out the session
# e.g. so if a user stays on a single page longer
# then this amount of time he will be sent to the login
# page again.
SESSION_timeout_minutes = 5
# Client IP Blacklist file
bannedNetworks = /apps/com/google/urlremover/badip.txt
# Proxy settings
proxySet = true
proxyHost = proxy
proxyPort = 80
userAgent = googlebot-urlconsole
# Database stuff
DBDriver = org.gjt.mm.mysql.Driver
DBUrl = jdbc:mysql://localhost/dbRemoveUrl
DBLogin = root
# put password in before the push
DBPassword = k00k00
# Publisher stuff
# ackQueue and outputQueue must already exist. Use
# google/setup/pcqueue.py create <pathofqueue> 128
# to create them. Also,
# //depot/ops/production/master/files/etc/cron.hourly/dynamic_gws_data_push
# must agree with us on where the queues are located.
ackQueue = /apps/publish/publish_ack_queue
ackDir = /apps/publish/publish_ack
outputDir = /apps/publish/current
outputQueue = /apps/publish/publish_queue
BadAll = badurls_autonoreturn
BadSnippet = badurls_autonosnippet
BadCache = badurls_autonocache
BadMsgids = autobadmsgids
Porn = badurls_autoporn
BadImages = badurls_autoimage
ImageTweak = badurls_autoimagetweak
BadOdp = badurls_autonoodp
BadDemoteGws = badurls_autodemotegws
BadSpam = badurls_autospam
BadSupplemental = badurls_autosupplemental
maxChecks = 15
lastPushFile = TIMESTAMP
sendEmailDelayMillis = 3000
# warn if no push within last 400 minutes
noPushWarning = 24000000
# If this is > 0, then no database changes are made, no urls are fetched,
# and no emails are sent.
readOnly = 0
# Where to store the Publisher's temporary disk maps
diskMapDir = /export/hda3/tmp
# I18N
i18nText = com.google.urlremover.I18N.text
langSupport = en-us,fr,de,it,ja
# Hit Rate Warnings
# The settings below will cause a warning email to be generated
# if we encounter >200 page hits in a 30 minute window.
# Note: the window does not slide
rateCheckIntervalMillis = 1800000
rateMaxHitsPerInterval = 800
# Request Status Display
RequestStatusCutoff = 10
# General limits
maxUrlLength = 511
maxRobotsLines = 100
# We give people 24 hours to verify their email or we expire their accounts
oldestUnverifiedMillis = 86400000
# New user email stuff
NewUserEmailFrom = url-remove@google.com
# Publisher email stuff
publisher.EmailFrom = url-remove@google.com
# Error emails stuff
# Note these are sent to internal engineers, no users see these
EMAIL_error = mstanton@google.com
# General email stuff
EMAIL_smtp = smtp
EMAIL_from = url-remove@google.com
EMAIL_name = url remover application
# General request expiry time = 180 days = 180 * 24 * 60 * 60 * 1000
expiryOffsetMillis = 15552000000
# Max time we're going to wait before getting a response from a site
# for the site down authentication method
# currently 360 secs, since Squid might take that long to time out DNS
maxWaitMillis = 360000
# max number of outstanding fetches (in parallel)
fetchParallelism = 50
# Authenticator config
robots.shellCommand = /apps/bin/robots_unittest --patterns --agents=Googlebot,Google
robots.shellCommandImage = /apps/bin/robots_unittest --patterns --agents=Googlebot-Image
metaTags.shellCommand = /apps/bin/ripper --datadir=/apps/smallcrawl-data --cjk_config=/apps/BasisTech --logtostderr --stdin --robotsmeta
# patterns that should not be removed via robots.txt
robots.noremove = /home/google/googlebot/bypass_robots.pat
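To show why exposing a properties file like this matters, here is a minimal Java sketch (assuming the dump above has been saved locally as config.txt; the class name and key selection are mine) that loads it with java.util.Properties and prints the entries an attacker would care about most:

import java.io.FileReader;
import java.util.Properties;

// Minimal sketch: parse the leaked properties file and print the most
// sensitive entries. Assumes the file was saved locally as "config.txt";
// the key names come directly from the dump above.
public class LeakedConfigReader {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        try (FileReader reader = new FileReader("config.txt")) {
            props.load(reader);
        }

        String[] interesting = { "frontDoor", "DBUrl", "DBLogin", "DBPassword", "EMAIL_error" };
        for (String key : interesting) {
            System.out.println(key + " = " + props.getProperty(key, "<not set>"));
        }
    }
}

Database credentials, the internal front door URL, internal file paths and engineer e-mail addresses are exactly the kind of “information useful to a black hat” mentioned above.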
Bear in mind that there are only two computers that are impossible to hack: 1 - a computer that is switched off, and 2 - a computer without a network card!