Hey there. I'm the PM for Pages. That's fair feedback (though a lot could have changed even in the past few weeks, so I would recommend taking another look).
- It's hard for me to tell what's causing the issue (the build logs should give an idea of at least at which stage). One thing we've added over the course of the beta is connection to GH comments, which requires a new type of permission. So that might be causing the failure, but also, failed builds should result in a comment (and email) for notifications, depending on your GH settings. All of that said, my email is rita at cloudflare if you want to forward me more details, I can help look into it.
- Creating a canonical notion of your domain is something we're thinking about. Right now the preview links set X-Robots-Tag so they can't be crawled, but yeah, ideally we want to give you a way to remove myproject.pages.dev if you're not using it
- The cache-control settings are intended to keep your site from serving stale content by relying on ETags — your browser should still serve from cache if the asset hasn't been updated. On the Cloudflare site, we have much longer TTLs.
Anyway, thanks for trying Pages! We're continuing to iterate on it, even after GA, so it should only get better :)
It is great that pages.dev is now GA: That happened f.a.s.t! Love the redirect feature; excited about the upcoming integration with Workers, KV, and Durable Objects.
The only things missing are per-branch build commands and the ability to exclude certain branches from builds altogether.
jam.dev are good friends of ours (both founders ex-cloudflare and some of my good personal friends!) :) but I was thinking about it more as a play on Jamstack.
re: excluding branches — we're actively looking into it!
and yes, that error has been fixed AFAIK, but let me double check.
It would be great to add "clean urls" - removing trailing slash for directory urls, like in firebase:
```json
"hosting": {
  // ...
  // Removes trailing slashes from URLs
  "trailingSlash": false
}
```
I also have a lot of problems with builds.
I tried two ways of deploying static pages to Cloudflare:
1) gatsby build
2) a fully static page (gatsby build on my local computer, then pushing the "public" dir to the GitHub repo)
Both ways generate the same error:
08:35:00.770 Finished
08:35:01.232 Deploying your site to Cloudflare's global network...
08:54:16.759 Failed due to an internal error
It looks like a hidden limit: a 20-minute deploy timeout?
My website has 13,000 small files (html, js, css, png, webp; max file size is about 300 kB),
and I can't deploy it because of this error.
I even reduced image sizes from 1600px to 800px and cut the file count to about 10,000, and I still get the error.
I have another website with 11,000 files where I use "gatsby build", and the build succeeds on about 1 in 5 tries...
To add to this, when I was trying out CFP, I found it annoying that there was no explicit "build" button. I changed the default branch that I wanted CF to build from, but it didn't automatically trigger a new build. And there was no button I could click (like there is on Netlify) to say "rebuild now".
+1 for this. Netlify has it and I have used it many times to redeploy a site after changing some config settings (e.g. env vars, build commands). Having git push as the only deployment trigger is a pain in these scenarios.
Do you plan to improve Cache-Control to make cache busting possible? Otherwise, even if the browser has the resource cached, there will still be a round trip to revalidate.
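The usual way to avoid that revalidation round trip entirely is fingerprinted filenames plus long-lived immutable headers. A sketch of the idea (the header values are the standard pattern used by Netlify and others, not current Pages behavior, and the function names are made up):

```python
import hashlib
from pathlib import PurePosixPath

def fingerprint(name: str, content: bytes) -> str:
    """Embed a content hash in the filename so the URL changes with the bytes."""
    p = PurePosixPath(name)
    digest = hashlib.sha256(content).hexdigest()[:8]
    return f"{p.stem}.{digest}{p.suffix}"  # app.js -> app.<hash>.js

def cache_headers(name: str) -> dict[str, str]:
    # Hashed assets never change at their URL, so they can be cached "forever".
    # HTML keeps a revalidating policy so new deploys show up immediately.
    if name.endswith(".html"):
        return {"Cache-Control": "public, max-age=0, must-revalidate"}
    return {"Cache-Control": "public, max-age=31536000, immutable"}
```

With this scheme the browser never re-contacts the server for a hashed asset; cache busting happens because a new deploy references a new filename from the (revalidated) HTML.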