MySQL: Is it possible to get database contents from any website without having DB access?

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA terms, cite the original URL, and attribute it to the original authors (not me): StackOverflow, original question: http://stackoverflow.com/questions/3956515/

Is it possible to get database contents from any website without having DB access?

mysql

Asked by Deepak

I want to retrieve the database contents from a website and use them in another website. Is it possible to access the database contents using any script? There are 500+ products in total on the site, and the owners are going to remove that content now. So before they remove it, I want to download all the data and host it on my website. It is not practical to visit each and every page and write down the product information. It would be helpful if I could get some kind of script to access the database and download all of it in a specified format.

For example: we have leech software for forums that gets content from other forums.

Answered by Shoban

No!!!! The question is not whether it is possible. It is whether you are allowed to copy from the site at all.

You cannot connect to the database without the connection details.

Why don't you contact the website owners and ask them to send you the data? They are going to remove it anyway.

Answered by chrisaycock

I'm assuming you don't have administrative privileges to their site, in which case you can't just issue queries to their database. Their web page acts as a front-end to the database (and their business logic) thus preventing users from ever interacting directly with the data. That set-up is intentional.

The leech software you mention is generally implemented as a scraper of sorts. It downloads the contents of a forum as just a webpage, no different from how a search engine would crawl the Internet. If need be, the leech might store/mimic a cookie in case the forum requires a logged-in user. But in no case does the leech access the site's underlying database.

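For illustration only, here is a minimal scraping sketch in Python using requests and BeautifulSoup. The URL pattern, product IDs, CSS selectors, and cookie name are hypothetical placeholders and would have to be adapted to whatever the real site serves.

```python
# Minimal scraper sketch: fetch each product page and pull out a few fields.
# The URL pattern, selectors, and cookie below are hypothetical examples.
import requests
from bs4 import BeautifulSoup

session = requests.Session()
# If the site only shows products to logged-in users, reuse a browser cookie:
# session.cookies.set("sessionid", "value-copied-from-your-browser")

def scrape_product(product_id):
    """Download one product page and extract name, price and description."""
    resp = session.get(f"https://example.com/products/{product_id}")
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    return {
        "name": soup.select_one("h1.product-name").get_text(strip=True),
        "price": soup.select_one("span.price").get_text(strip=True),
        "description": soup.select_one("div.description").get_text(strip=True),
    }

# Assuming product IDs 1..500; a real crawler would discover URLs from listing pages.
products = [scrape_product(pid) for pid in range(1, 501)]
```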

So your choices are to either write your own crawler, or email the site's administrator (as others have suggested). I'm not going to get into the issue of whether what you're asking for can be considered fair use; you should consider any legal ramifications before you attempt scraping.

Answered by Earlz

No, it is not possible. I would recommend getting in contact with the site owner, explaining why you want their database, and they may let you have it. Otherwise your best bet is to crawl the site, get a copy of every page, and then manually populate (or attempt to automate it somehow) your own database from it.

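If you do crawl the pages, the scraped records still have to be loaded into your own database. Below is a rough sketch of that step, assuming mysql-connector-python and a `products` table you define yourself; the connection details and column layout are placeholders.

```python
# Sketch: load scraped product records into your own MySQL database.
# Connection details and the table layout are placeholders.
import mysql.connector

products = [  # normally this would come from your crawler
    {"name": "Example widget", "price": "9.99", "description": "Sample row"},
]

conn = mysql.connector.connect(
    host="localhost", user="me", password="secret", database="mysite"
)
cursor = conn.cursor()
cursor.execute(
    """CREATE TABLE IF NOT EXISTS products (
           id INT AUTO_INCREMENT PRIMARY KEY,
           name VARCHAR(255),
           price VARCHAR(64),
           description TEXT
       )"""
)
cursor.executemany(
    "INSERT INTO products (name, price, description) VALUES (%s, %s, %s)",
    [(p["name"], p["price"], p["description"]) for p in products],
)
conn.commit()
conn.close()
```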

Answered by klox

If you have permission on that site, or you are the administrator of that site, maybe you can take a SQL dump (for example with mysqldump) or use any other package for dumping the data. But if you don't have permission, you can't do that.

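For completeness, if you do have credentials, a dump is straightforward. Here is a sketch of driving the standard mysqldump tool from Python; the user, password, and database names are placeholders.

```python
# Sketch: dump a database you have credentials for, using mysqldump.
# All connection details here are placeholders.
import subprocess

with open("backup.sql", "w") as outfile:
    subprocess.run(
        ["mysqldump", "--user=me", "--password=secret", "--host=localhost", "shopdb"],
        stdout=outfile,
        check=True,
    )
```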