Score/PornMegaLoad Metadata

All links below are NSFW!

The Score Group has been releasing a large amount of its own content for over 20 years, which means they have an extensive catalogue of material available across a number of different websites, as can be seen here.

The issue with their metadata is their habit of recycling older content and presenting it as new.

The easiest example is that they also cross-post the same scenes across multiple websites, showing one as a new release while the other keeps the original release date.

URLs are constructed in three parts: site/model/production number.

For the Haley Reed scene, the URL is https://www.18eighteen.com/xxx-teen-videos/Haley-Reed/52416/

As it currently stands, the latest new release on 18eighteen is https://www.18eighteen.com/xxx-teen-videos/Mila-Pie/77491/

As we can see, the production number is significantly lower, which highlights that this scene has actually been recycled with a newer date and, in most cases, at least some different metadata.
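If you want to script that check, the comparison can be automated by pulling the production number out of the scene URL. A rough sketch only (the helper name and regex are my own, not anything official from the site):

```python
import re

def production_number(scene_url: str) -> int:
    # URLs look like https://www.18eighteen.com/xxx-teen-videos/Haley-Reed/52416/
    # so the production number is the last path segment.
    match = re.search(r"/(\d+)/?$", scene_url)
    if not match:
        raise ValueError(f"No production number found in {scene_url}")
    return int(match.group(1))

old_scene = "https://www.18eighteen.com/xxx-teen-videos/Haley-Reed/52416/"
latest = "https://www.18eighteen.com/xxx-teen-videos/Mila-Pie/77491/"

# 52416 vs 77491 -- a "new" release numbered far below the current run
# is a strong hint the scene has been recycled.
print(production_number(old_scene), production_number(latest))
```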

The next step is to go back on Wayback to find the original release details. In the case of the Haley Reed scene, we can see that it had a completely different title on original release: 18eighteen - Face Painting Art - Haley Reed (23:40 Min.).
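The Wayback lookup can also be automated via the public Wayback Machine availability API. A minimal sketch, assuming the requests library is installed and that a snapshot actually exists for the URL:

```python
import requests

def closest_snapshot(url: str, timestamp: str = "20160101") -> str | None:
    # Ask the Wayback Machine for the snapshot closest to a given date so the
    # original title/date can be compared against the current page.
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url, "timestamp": timestamp},
        timeout=30,
    )
    resp.raise_for_status()
    closest = resp.json().get("archived_snapshots", {}).get("closest", {})
    return closest.get("url") if closest.get("available") else None

print(closest_snapshot("https://www.18eighteen.com/xxx-teen-videos/Haley-Reed/52416/"))
```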

Thankfully Wayback seems to have done a good job capturing snapshots over the years!

Cover Images

The cover images shown on the scene listings are different from the images shown on the individual scene pages.

The listings pages show what, in most cases on older scenes, is the original scene image, but the individual pages show a number of screenshots from the scene and then pick one to be the “cover”.

The image currently shown on the scene page is:

https://cdn77.scoreuniverse.com/modeldir/data/posting/52/416/posting_52416_x_1920.jpg

The correct cover image is:

https://cdn77.scoreuniverse.com/modeldir/data/posting/52/416/posting_52416_1920.jpg

Removing _x from the URL will show you the correct cover image.
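A tiny sketch of that fix; it is nothing more clever than a string replacement on the screenshot URL:

```python
screenshot_url = (
    "https://cdn77.scoreuniverse.com/modeldir/data/posting/52/416/posting_52416_x_1920.jpg"
)

# Drop the "_x" marker to get the proper cover image instead of the screenshot pick.
cover = screenshot_url.replace("_x_", "_")
print(cover)
# https://cdn77.scoreuniverse.com/modeldir/data/posting/52/416/posting_52416_1920.jpg
```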

Two further things to note here: the code in the image path is the same production code as the scene, so it is easily replaceable, and the path splits that code with the last three digits forming the second folder. If the code is only 4 digits long, the first part is only 1 digit followed by 3, as seen here: https://cdn77.scoreuniverse.com/modeldir/data/posting/6/889/posting_6889_xl.jpg
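Putting that together, building the CDN path from a production number is just a matter of splitting off the last three digits. A rough sketch; the function name and the _1920 default are my assumptions, and not every scene actually has a 1920 image (see the size list below):

```python
def cover_url(production_code: int, suffix: str = "_1920") -> str:
    code = str(production_code)
    # The CDN path splits the production code with the last three digits
    # as the second folder: 52416 -> 52/416, 6889 -> 6/889.
    prefix, last3 = code[:-3], code[-3:]
    return (
        "https://cdn77.scoreuniverse.com/modeldir/data/posting/"
        f"{prefix}/{last3}/posting_{code}{suffix}.jpg"
    )

print(cover_url(52416))        # .../52/416/posting_52416_1920.jpg
print(cover_url(6889, "_xl"))  # .../6/889/posting_6889_xl.jpg
```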

With a site that’s been around a while and doesn’t have any kind of set standards, not all images are available in HD. The following sizes exist; just replace the end part of the file name until you find a valid size (a probing sketch follows the list).

1920x: _1920.jpg
1600x: _1600.jpg
1280x: _1280.jpg
800x: _800.jpg
600x: _xl.jpg
450x: _lg.jpg
225x: _med.jpg
100x: .jpg (No size info = a tiny image!)
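A minimal sketch of that probing, using HEAD requests to find the largest size actually hosted. The suffix order and helper name are my own, and it assumes the cover_url helper from above:

```python
import requests

SIZE_SUFFIXES = ["_1920", "_1600", "_1280", "_800", "_xl", "_lg", "_med", ""]

def best_available_cover(production_code: int) -> str | None:
    # Try the suffixes from largest to smallest and keep the first one the CDN
    # actually serves; older scenes often only exist at the smaller sizes.
    for suffix in SIZE_SUFFIXES:
        url = cover_url(production_code, suffix)
        if requests.head(url, timeout=30).status_code == 200:
            return url
    return None

print(best_available_cover(52416))
```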

As I originally put on GitHub a long time ago: “Whilst width is consistent, height is variable depending on source as we’re dealing with a highly consistent organisation!”


@Ronnie711 all things being equal, should we prefer scraping scenes from the sub-studios (e.g., 18eighteen.com) over scraping from the network platform (PornMegaLoad)? As you know, they will yield different metadata results, like studio codes and sometimes cover images.

My personal opinion is for direct sub-studio scraping, since the scenes reference them as the studio, and somewhere over the years I got the impression from others that this was the proper way to do it. But I’d like to know what you and anyone else who is familiar with this content think.

It’d be nice to hammer out a clear preference in a revised edit of your post, just for consistency’s sake and to have something to reference.

So, sub-studios are preferable, as PML is essentially ‘ScoreHub’. I subbed recently to see what’s what, and PML only has a limited library available. Essentially, if it hasn’t been released in 1080p/4K, then it won’t be on PML.

PML is also unreliable in that they seem to be recycling the recycling more often - for some reason scenes from 2020 need repackaging already even though they were originally released in 4K …

Personally, PornMegaLoad, ScoreVideos and ScoreHD should be the ‘studio of last resort’ for metadata, as for the most part scenes are/were originally available on a sub-studio somewhere …

Thanks for the quick reply @Ronnie711. Good to know that I’m on the same page as others who are more familiar with the network’s content.

Would you mind editing your first post to briefly include your rationale for scraping directly from the sub-studios rather than the hub platforms? I think it would be useful since people will probably use your post as a reference or guide.


I’ll have a think over the weekend about how to word the ‘strong suggestion’, as it’s worth formulating it in an evidence-based format rather than dumping it in as what will come across as ‘my opinion’ :rofl:
