image stats
rating: 3.63
votes: 76
views: 2104
uploader: woberto
comments: 14
date added: 2025-05-14
category: None Yet
facefuck
[image: "a screenshot of a social media page"]

uploader: woberto
date: 2025-05-14
Comments for: facefuck
pulse
Date: May 14, 2025 09:19AM

Bet you've never seen plus613 tracking you.
Anon - not logged in
Date: May 14, 2025 09:51AM

That good are you?
woberto
Date: May 14, 2025 10:45AM

Do you get stats anymore, pulse?
Simple crap like referrals, previous website, next website, bookmarks etc.
Some webpages took forever to download because they were uploading everything from your browser.
Now the booble anal-itics stuff goes way beyond all that so I guess webpages like that are long gone.
pulse
Date: May 14, 2025 12:24PM

I'm sure there's a certain amount of analytics I could get from Cloudflare, given it proxies all the queries to the site and uses its own caching proxies, so many items I don't even see hit the server.

But then I have an array of caching proxies at the WAF side, so a lot of the stuff that does come in from Cloudflare STILL doesn't hit the core where I could look at the logs. Database lookups will have to come in, but basically anything that's static (PHP template content, images etc) is roughly 60% cached by Cloudflare, 30% cached by my own front end, and the remaining 10% makes it through because the caches have expired and need to be re-checked.
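
(For illustration only, not the site's actual code: that "re-check" when a cache expires is normally a conditional request, which a plain PHP image handler at the origin could answer roughly like this. The paths and lifetimes are hypothetical.)

    <?php
    // Hypothetical origin-side image handler: lets an expired upstream cache
    // revalidate its copy instead of pulling the whole file again.
    // Paths and lifetimes are illustrative, not the real setup.
    $path = '/srv/images/' . basename($_GET['id'] ?? '') . '.jpg';
    if (!is_file($path)) {
        http_response_code(404);
        exit;
    }

    $mtime = filemtime($path);
    $etag  = '"' . md5($path . $mtime) . '"';

    header('Cache-Control: public, max-age=604800');                  // proxies may keep it for a week
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
    header('ETag: ' . $etag);

    // If the cached copy is still current, answer 304 and send no body.
    if (($_SERVER['HTTP_IF_NONE_MATCH'] ?? '') === $etag
        || strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE'] ?? '') >= $mtime) {
        http_response_code(304);
        exit;
    }

    header('Content-Type: image/jpeg');
    header('Content-Length: ' . filesize($path));
    readfile($path);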

The amount of useful logs that hit the back end is pretty minimal. But yeah CF I guess has a lot of that data. Probably even more if I paid for it.

Long story short: the logs are smeared over a dozen different systems and I don't care that much. The only REALLY central place to check is Cloudflare, but I don't look because... I still don't care :)
pulse
Date: May 15, 2025 11:56PM

Quote (anon): That good are you?

That's why I wear sneakers. For sneaking.
woberto
Date: May 17, 2025 02:08AM

I just love stats I guess.
My three letter unix passwords have done me fine for 25 years now.
But outside of that I am fairly paranoid, even if it is for no good reason.
I still type all web addresses, usernames, passwords and even phone numbers (except for work devices). I can memorise 50 logins and phone numbers for my personal life, but for work that's not possible, so it's all saved on devices.
pulse
Date: May 17, 2025 06:07AM

I don't do it anymore, but up until probably 2010 the only reason I used the contacts in my phone was so caller ID would display who was calling at a glance.

Prior to that, you'd tell me your # and I'd remember it. Was amazing at it. Every time I'd ring somebody I'd just dial the number. I still remember the home #s of every one of my friends, incl primary school ones. I was lightning fast on my 8110.

These days I don't really bother anymore. I remember important ones, but I don't even try. Passwords are similar; I've got some slack passwords on stuff that just doesn't matter, same ones for 30 years, or it's behind 4 layers of security or whatever.. but work effectively requires the use of a password manager because they require e.g. 20+ character passwords. It's pointless trying to remember those, so it's back to post-it notes.
pulse
Date: May 18, 2025 09:52AM

Stats from Cloudflare for the previous 24h, as of about an hour ago. All of this is just what they have cached; I cache a lot more on top of what CF does.

[Cloudflare analytics screenshots]
woberto
Date: May 18, 2025 09:33PM

Kewel.
How is bandwidth calculated?
Is the web server providing that bandwidth or is it accumulated traffic?
Or goats?
pulse
Date: May 19, 2025 01:13AM

I'm not 100% sure, but I THINK that's just images going to the internet. It definitely doesn't include any of the back end data transfers.

Requests come in through CloudFlare, which does some security stuff and has a large cache engine for images etc. I can't remember what the 'time to live' for the CF cache is, could be an hour or two (eg if 2 people view an image within an hour, it's fed from CF instead of requesting it back to the image farm). The page code will also tell your browser to cache some of it itself (site logo, some of the CSS data and stuff like that, static assets). CloudFlare is massive, has insane bandwidth and has caching pools all over the world, so chances are wherever you're viewing from, any cached data from CF has loaded very quickly because it's come from close to you.
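
(As a rough sketch of how those two cache lifetimes can be expressed, assuming a PHP script serving a static asset; the numbers and the path are guesses based on the description above, not the real config.)

    <?php
    // Illustrative only: max-age is what the visitor's browser may keep,
    // s-maxage is what shared caches (Cloudflare, front-end proxies) may keep.
    header('Content-Type: text/css');
    header('Cache-Control: public, max-age=86400, s-maxage=7200'); // browser: 1 day, CDN: ~2 hours
    header('Vary: Accept-Encoding');
    readfile(__DIR__ . '/static/site.css'); // hypothetical asset path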

Anything that passes that layer without being fed directly from CF is then forwarded to a group of caching reverse proxies, with a much higher TTL (7 days I think). That's really just caching for images, which don't change. I haven't managed to troubleshoot this yet, but if you upload an image and immediately go to the gallery, sometimes there's no thumbnail for the image yet; that's because there's a cache miss but the request isn't forwarded properly back to the web server. Reload the page, and it's there. That pool is on multiple 10Gb (shared, obviously) links to CF/internet.
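
(Purely hypothetical, not what the site does: one common way around that kind of race is to have the origin build the thumbnail on first request rather than assuming it already exists at upload time.)

    <?php
    // Hypothetical on-demand thumbnailer using GD. If the thumbnail isn't
    // there yet (e.g. right after upload), build it from the original once,
    // then serve it with a long cache lifetime. Paths and sizes are made up.
    $id    = basename($_GET['id'] ?? '');
    $orig  = "/srv/images/full/$id.jpg";
    $thumb = "/srv/images/thumb/$id.jpg";

    if (!is_file($thumb) && is_file($orig)) {
        [$w, $h] = getimagesize($orig);
        $tw  = 200;
        $th  = (int) round($h * $tw / $w);
        $src = imagecreatefromjpeg($orig);
        $dst = imagecreatetruecolor($tw, $th);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $tw, $th, $w, $h);
        imagejpeg($dst, $thumb, 85);
    }

    if (!is_file($thumb)) {
        http_response_code(404);
        exit;
    }
    header('Content-Type: image/jpeg');
    header('Cache-Control: public, max-age=604800');
    readfile($thumb);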

Anything that passes THAT layer (this will be HTML requests, as the pages are dynamic, or images that haven't been looked at in a while) is forwarded to the front end web servers, which then decide what to do with the request. 1Gb fibre connection.

If it's for an image, it goes straight to the image pool, which provides it back (feeding each layer above in turn, so it'll be cached on the way through both by the front end and by CF).
If it's for HTML/PHP, it goes to the PHP accelerators, which talk to the database cluster at the back end, generate the page dynamically and feed it back through the stack. That response doesn't get cached, so every time you view a page it's generated by the back end.
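
(A minimal sketch of the headers a page like that would send so nothing upstream keeps a copy; the markup below is just a stand-in for the real template/database work.)

    <?php
    // Illustrative only: tell Cloudflare, the proxies and the browser not to
    // cache this response, so every view is rebuilt by the back end as described.
    header('Content-Type: text/html; charset=utf-8');
    header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');
    header('Pragma: no-cache');

    // Stand-in for the real page generation:
    echo '<html><body>Generated at ' . htmlspecialchars(date('r')) . '</body></html>';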

All the back end is interconnected at 40Gb, and all data is stored on NVMe flash RAID arrays so it's normally pretty fast. There's a lot of caching in front of the slowest link in the chain, but with that said that link is still 10x faster than what used to run everything when it was in Florida. The reality is a picture site doesn't require much anymore.

We're a long, long way from when we started the site and just couldn't keep up and were building hack upon hack to try to handle things. We were actively kicked off hosting companies for being too big!

Oh, and it still does nightly backup back to Florida because I kept 1 system there.
quasi
Date: May 19, 2025 11:31AM

You're just one hurricane away from losing that backup.
pulse
Date: May 19, 2025 12:18PM

Haha. Oh well, these things happen.

I have local backups too; so long as we don't have a hurricane in Melbourne, Australia and Melbourne, Florida at the same time, we'll be fine.
woberto
Date: May 20, 2025 08:06AM

Shit pulse, you just tempted fate.
My (2nd) favourite website went bye bye in this...
[www.bleepingcomputer.com]
pulse
Date: May 20, 2025 12:06PM

Haha. Well I'm sure you miss that chess website :(

It would be unfortunate for any of those locations to burn down. But I'd be far more annoyed with one than the other.

It should actually be possible for me to live-flip the site between Melbourne-based and Florida-based hosting, and aside from about 60 seconds of downtime (and the loss of any changes since the nightly snapshot) you wouldn't notice. That's my 'disaster recovery' strategy - wing it. I have also stretched the local L2 network between the 2 countries, tunnelled over a VPN interface. Performance is terrible. An L2 VLAN should have sub-ms response times, and I think it's about 180ms round-trip time.. so it's all a bit shit. But it does work.

With that said, the site would be the least of my worries if I lost all my equipment.