Managing Public Hosting
This article covers how failed access attempts are logged and billed, the limitations around server-side scripting and custom domains, and how to control search engine indexing through password protection and robots.txt.
Logging
Public Hosting logs capture the requests related to publicly served files, including failed requests. A request can fail because password protection is enabled for the publicly hosted folder and the visitor does not provide valid credentials, or because the requested path does not exist on your public hosting domain. Failed requests are included in your billable API calls.
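If you want to review these requests programmatically rather than in the web interface, you can query the Files.com REST API. The sketch below is illustrative only: the endpoint path, query parameters, and response field names are assumptions, so confirm the exact resource for Public Hosting request logs in the Files.com API reference before relying on it.

    import requests

    # Assumption: your Files.com subdomain and an API key that is allowed to read logs.
    API_KEY = "YOUR-API-KEY"
    BASE_URL = "https://SUBDOMAIN.files.com/api/rest/v1"

    # Hypothetical resource name for Public Hosting request logs; check the
    # Files.com API reference for the real path before using this.
    response = requests.get(
        f"{BASE_URL}/public_hosting_request_logs.json",
        headers={"X-FilesAPI-Key": API_KEY},
        params={"per_page": 100},
    )
    response.raise_for_status()

    for entry in response.json():
        # Field names such as "path" and "status" are also assumptions.
        if entry.get("status", 200) >= 400:
            print(entry.get("path"), entry.get("status"))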
Other file operations that occur in publicly shared folders are captured in their respective logs. For example, operations by logged-in users are included in History Logs, and downloads via share links are found in Share Links Usage Logs.
Search Engine Crawlers
Files.com automatically publishes a standard robots.txt file for the hosting domain to indicate that automated crawlers should not access your publicly hosted URLs. Most search engines respect this directive, but compliance is voluntary and not every crawler will honor it.
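A robots.txt file that asks all crawlers to stay away from every path uses the conventional disallow-all form shown below. The exact file Files.com serves for your hosting domain may differ.

    User-agent: *
    Disallow: /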
You can configure each Public Hosting folder to Require a password to access. When a password is enabled, web crawlers are blocked and your files cannot be indexed by search engines.
Configuration Limitations
Files.com does not currently support other configuration options, such as custom HTTP headers.
You cannot use Public Hosting to serve sites that rely on server-side scripting, such as PHP, .NET, or JSP, or on database operations.
Basic HTTP authentication is the only form of authentication available with Public Hosting.
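For example, a script or integration that needs to download a file from a password-protected Public Hosting folder can supply credentials using standard HTTP Basic authentication. The hosting domain, path, and credentials below are placeholders.

    import requests

    # Placeholder hosting domain, path, and credentials; substitute your own values.
    url = "https://YOUR-PUBLIC-HOSTING-DOMAIN/reports/summary.pdf"

    # Without credentials, a password-protected folder responds with 401 Unauthorized.
    anonymous = requests.get(url)
    print(anonymous.status_code)  # expected: 401

    # With HTTP Basic authentication, the same request succeeds.
    authorized = requests.get(url, auth=("username", "password"))
    authorized.raise_for_status()
    with open("summary.pdf", "wb") as f:
        f.write(authorized.content)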
Custom domains do not apply to Public Hosting URLs, but you can place an edge platform, such as a CDN or reverse proxy, in front of Public Hosting to serve your content from your own domain.