Jimmycrackcrack

joined 1 year ago
[–] [email protected] 4 points 22 hours ago

I ended up adopting this practice after I first started cooking for myself, didn't think to do it, and wondered why the carrots were so unpleasant. The peel is just too... carrotty. It's a super intense carrot taste to the point of unpleasantness, and even with a good wash it kind of tastes like dirt. I only really like it when it's those little carrots sometimes referred to as 'dutch carrots' and they're roasted so you get some blackened char on that skin.

[–] [email protected] 1 points 2 days ago

There are Windows-only laptops?

[–] [email protected] 3 points 2 days ago

You have my vote for your interpretation, that had always been my understanding too.

[–] [email protected] 2 points 3 days ago

Fucking love halloumi

[–] [email protected] 13 points 3 days ago* (last edited 2 days ago)

Hopefully in a year or two they'll just go back to calling it Twitter, or maybe if we're lucky it will go out of business, and then they'll probably still just call it Twitter because the X thing would have been a short-lived portion of its overall lifespan.

[–] [email protected] 2 points 3 days ago

Knowing my memory, I'd forget it all very soon after it happened and need a history book to help me recall any of it, and the stuff left out or distorted would end up warping that recollection enough that it'd be so unreliable I may as well believe the historians. I can scarcely remember the previous day as it is.

[–] [email protected] 5 points 5 days ago (2 children)

You might want to try soft claws. They're little plastic caps in the shape of a cat's nail (except soft, not pointed) that you fill with a non-toxic adhesive that comes with the pack and attach to the cat's nails. Over time the cat naturally sheds their nails and the caps come off with them, and you just replace them when that happens. They say it doesn't bother the cats and they don't even notice, but I think they're exaggerating a bit there, because my cats hate the process of having the cap put on, which makes me feel bad, but once it is actually on they quickly forget about it and it doesn't bug them.

It's not a perfect solution: the caps are finicky to work with, the applicator for the adhesive gets clogged with dried glue, the cats don't really like having them applied, the caps themselves are really quite expensive, and some items, like wooden furniture, still get damaged even with the soft claws on. The damage in such cases is still greatly reduced though, and for soft items like a couch it pretty much stops them doing any damage.

One thing to keep in mind though: don't buy cheap ones off eBay. They're not worth a cent, the cats hate them for some reason, they don't seem as well manufactured, and they don't come with little cleats inside to help lock them in place. Because the cats absolutely hate them and are definitely bothered by them, they pull them off straight after you've applied them, so they're just a waste of your time and an unnecessary source of stress for your poor kitty. I've found 2 brands that seem to actually be good and they're both a lot more expensive than I'd like, but at least you can buy a supply of several months and save your possessions from destruction. The two that worked for me are Soft Claws and Soft Paws. Soft Claws seems slightly cheaper. I think they might actually be the same product, since they have the exact same artwork and typeface for their packaging and logos and the caps themselves seem to be identical; one of them just says claws and the other paws. Weirdly enough they've both chosen to use Garfield on their packaging, and somehow I'm fairly sure neither has paid for the privilege.

[–] [email protected] 2 points 5 days ago

I don't know why the kittens should be the part that hurt me the most to read, but well, it did.

[–] [email protected] 3 points 5 days ago (1 children)

Damn, did they get shamed into oblivion at least?

[–] [email protected] 4 points 6 days ago (1 children)

I think the confusion is that you seem not to like what is presumably Christmas because you perceive it to be fake, but Festivus is literally, actually fake, since it comes from a plot line in a TV series from the 90s and has only been celebrated by a broader range of people since then as a fun tribute to that series. You could argue that the fact that people really celebrate it means it necessarily can't be fake, but then by that logic...

[–] [email protected] 3 points 6 days ago

I kinda like it. I guess it helps that in my part of the world it's absolutely blazing hot in summer. I love that, but with the intense onslaught of sun over that period, by the time winter rolls back around it's kind of a welcome change. I also just look way better in winter clothes, so it's nice to feel better about my appearance for that portion of the year.

I also find that it's way easier to warm yourself up when it's cold than to cool down when it's hot. Don't get me wrong, I'm a big wuss, so all summer I'll whine and moan about it being soo hot and then immediately complain about being freezing in winter, but on balance I find the discomfort of my region's winter a bit easier to deal with than its summer. I also like not being completely covered in a layer of sweat.

I don't especially care when the daylight hours fall. I'm as happy being out and about at night as I am in the day and appreciate either for different reasons, so if more of my waking hours are taking place in darker periods of the day then I'm just appreciating those for what they are, just as I appreciate all the bright and sunny hours. I would say that as someone who has trouble sleeping when it's too bright, I definitely prefer it when the sun comes up later and doesn't wake me up.

It probably helps that I'm hardly an outdoorsman, so it's not like many, if any, of the things I'd actually do across a year are really curtailed by the mandates of the season, though I guess I do miss the beach. Besides, like a lot of people, I work indoors, so a good chunk of any given day is taken up by a minimum 8 hours of work usually starting at 9, so when the weather is absolutely beautiful and sunny and clear I'll see it for about 20 minutes out the car window before going into a building with the blinds drawn and the air-conditioning on until I emerge at what is then evening.

 

Is anyone else getting this? I've heard it a few times but the 2 recent examples I bothered to remember were this one:

https://youtu.be/oSG7HpdQ34w?si=sqra8x0x1igNFzRs&t=876

Where at around 14:37 the audio for the entire video, not just the dialogue in this instance, went mute mid-sentence and remained that way until 14:49.

And then this one again today:

https://youtu.be/hS2emKDlGmE?si=XFpt_MsY2Nrff_MV&t=1788

At around 29:48, where only the dialogue cuts out. The first few times I ran into this I assumed the video had just had an editing error, but it's happening too often for that to be it. I noticed recently that my laptop will do some kind of automatic downmixing of 5.1 audio so I can hear it in stereo, but only when playing through my laptop's speakers and not through the dock it's connected to with attached speakers, in which case it only monitors some channels of a 6-channel audio source. I checked that by switching to the internal speakers for the sections of mute video in question, though, and it made no difference.
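In case anyone wants to rule out a channel-mapping problem rather than a flaw in the upload itself, one check that occurred to me (filenames here are just placeholders, and I'm assuming a downloaded copy carries the same audio track the browser streams) would be to inspect the audio and then force a stereo downmix:

ffprobe -v error -select_streams a -show_entries stream=channels,channel_layout -of default=noprint_wrappers=1 clip.mp4

ffmpeg -i clip.mp4 -ac 2 clip_stereo.mp4

If the dialogue is still missing in the forced-stereo copy, the gap is baked into the source; if it comes back, it's probably the playback chain dropping the centre channel, which is where dialogue usually sits in a 5.1 mix.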

 

I used to search for a video file using Spotlight and it would return several results, and when it was a file I'd accessed more than a couple of times it'd be the top result. I'd see an icon of the application used to play that media type and the name of the file.

Nowadays, with the same keyword, I frequently can't find the file I successfully found before, but also, even when it does find it, it doesn't display it anywhere near the top results. It's down in a section called 'photos from apps', which presents a grid of options rather than a list, all of which are represented by the VLC icon (as it's my default media player) but with NO filename. I have found it before because usually it's the one preselected, although not always. It's super frustrating not being able to actually see what's been found. I think this section is probably supposed to display photos, or maybe videos, as thumbnails, to make finding an image easier than scanning filenames, given the section's name, but I'm not even looking for an image anyway. And besides, if I was and the thumbnails actually worked, I'd still have had to type the exact or at least a similar filename of the image I'm looking for, making a visual search pretty useless.

To be clear, I'm not looking to get rid of Spotlight's ability to search media such as videos or images, I just want the results of that search presented the same way they used to be back on High Sierra. (It probably persisted beyond that, but I jumped from HS to Sonoma and now Sequoia.)
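A possible workaround in the meantime, for anyone else stuck on this, is querying the same Spotlight index from the Terminal, which still returns a plain list of full paths (the keyword below is just an example):

mdfind -name "project_cut"

It doesn't fix the Spotlight window itself, but at least there's no 'photos from apps' grid hiding the filenames.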

 

It's way cheaper than the app store and Steam and Humble Bundle

 

This used to be an option; now it isn't. I found command line solutions, which would be fine, but they're all for scheduling sleep at set times and I want set durations.
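For anyone after the same thing, the duration-based versions I'd try look something like this (I haven't verified either on Sequoia specifically, so treat them as a sketch rather than a known-good answer):

sudo shutdown -s +90 (puts the Mac to sleep 90 minutes from now)

sleep $((90*60)) && pmset sleepnow (same idea, from a shell window you leave open)

Under the hood both just turn a duration into a point in time, but at least what you type is a duration.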

 

When this feature first came out I immediately disabled it because I noticed you couldn't stretch windows across two screens. I thought I might try putting up with it, but at the time I found it made it damn near impossible to use Avid Media Composer effectively.

It's been disabled in all the years since, and I'd never thought about it again until just recently, when twice in one month I've come across situations where something only works if it's enabled. One is disabling the behaviour where fullscreening a video makes a second monitor turn black, and now, with Sequoia, the native window tiling they just introduced.

I've always used Magnet, and later Rectangle, to achieve the same thing and will probably just revert, but I thought I'd give the native option a shot. I checked to see if that stupid shit with not being able to span windows had been fixed in the intervening period, but it hasn't. Does anyone know if you can make it work with separate Spaces enabled, or at least why it works this way?

 

I don't really buy games much these days. I was trying to see what games would work on Mac and was pleased to see a new Assassin's Creed game is coming out on Mac natively. I was pretty stoked with this news; I've never played any of the AC games but they've always looked good.

I thought I'd check the Apple App Store to see if there were any other AC games that might already be out, and there was only one option (actually on some 'App Store Preview' thing, not the actual App Store), called Assassin's Creed Mirage. It was listed as free to play with in-app purchases. I'm really just not participating in that, can't stand that shit. I don't think I've actually bought any Ubisoft games since the Nintendo 64; are they all like this or is that just some unfortunate anomaly? I noticed also that the listing said they collect data about me, which, WTF?

Keen to wait till November for AC Shadows but not if it's going to be any of that nonsense.

 

I love that game and it's the best RTS I've played. It seemed to basically rip off the Civ games heavily but simplify them and put them in an RTS context. Everything I loved about Age of Empires as a kid, but much better, and spanning the ancient age to the information age.

I run an M2 Max Mac, which makes things complicated, but I'm open to jumping through some hoops, if such hoops exist, to make something that wasn't supposed to work on Mac work on Mac. I'd just need to know whether it can even be done for that particular game. Obviously direct compatibility out of the box would be great.

I really don't want anything that's multiplayer only, as I'm unlikely to ever play online and prefer single player games.

I really hate free-to-play games and just want to buy the game outright in its finished state, one that will stay the same for as long as I own it, and then pay for any expansions or new additions at my discretion if they get released.

I'd like it to have the same all-of-history-spanning scope for tech.

I'd like there to be air units and naval units.

 

Or could you write a virus, or trick someone into doing just that? When did thermal throttling first become a thing, for that matter?

 

It doesn't really make any sense how this could possibly be related, and for that reason I don't rule out some other factor being at play, but the correlation has been pretty evident on all 3 occasions when I tried to do this, and the absence of the same effects when I don't do this and instead do pre-boil the pasta all seem to point to it being the relevant variable.

Assuming the lack of pre-boiling somehow is responsible, the obvious solution is to just stop doing that, and indeed I have, for fear of a repeat of the horrible experience. It's just that the ease and efficiency of the method is so appealing that I would like to try it again. But I also really don't want to gamble on that unless I can be pretty well assured that the results were unrelated to the lack of a pre-boil, or, if that actually is a plausible cause, I'd like to learn by what mechanism it could possibly play a role and why it doesn't bother most people.

 

I recently bought an external PCIe enclosure so I could make use of a specific PCIe device in an editing setup. One of the nice things about this particular enclosure is that it also happens to come with an m.2 slot for NVME drives as well.

Usually when I edit with my home setup, I'm provided with the storage by the client, and even if not, video media plus backups at the very least takes up a lot of room and NVMe drives are expensive, so I'd usually opt for something cheaper as the actual location for the footage and assets. I figured then that it might be worth taking advantage of an NVMe drive of a smaller, more affordable capacity and using it just as a location for video render cache that I clear after every project wraps. The high speeds of these drives seem like they would be a good fit for this purpose.

However, I've heard that SSDs, including NVMe, are famously short-lived and have particularly short life spans in terms of number of write operations. Is that still the case, and would the constant writing and clearing of relatively small video files actually be kind of the worst use of one of these drives?
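For a rough sense of scale (these numbers are assumptions, not the spec of any particular drive): a 1 TB consumer NVMe drive is commonly rated for somewhere around 600 TBW of endurance. If a render cache churned through, say, 200 GB of writes per working day, that's 600,000 GB / 200 GB ≈ 3,000 working days before hitting the rating, which is years of constant editing. Endurance ratings vary a lot between models though, so it's worth checking the TBW figure for whatever drive actually goes in the slot.

Once a drive is in there, the wear should also be checkable after the fact with smartmontools from Homebrew (the disk identifier below is a placeholder; the right one shows up in diskutil list):

smartctl -a /dev/disk4

The NVMe health section reports 'Percentage Used' and 'Data Units Written', which together give a fairly direct read on how fast the cache workload is eating into the rated endurance.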

 

My understanding of TB4 versus TB3 is that they're essentially the same; it's just that the TB4 standard mandates that a device must do everything TB3 only optionally could. Minimum bandwidth is increased, and I think I read something about power delivery minimums as well. This eGPU chassis I bought came with its own TB4 cable, which is actually the first Thunderbolt cable I've seen that specifically says "4" on it.

I assume the reason they supplied this is that, given what it does, an eGPU chassis is going to need to support some pretty serious bandwidth for a GPU. In my case, though, I'm actually using the chassis not for a balls-to-the-wall kick-ass graphics card, but to let me attach an old and very humble I/O card from Blackmagic. It's currently working just fine for that purpose.

Thing is, the supplied TB4 cable is pretty short, and the chassis, along with the ATX power supply mounted on it, makes for a pretty hefty, desk-space-consuming setup. I'd like to move the whole thing somewhere fairly far off from the laptop to save me some precious desk space. I looked up 2m Thunderbolt 4 cables, which I understand is the longest you can get for TB4 while still maintaining full bandwidth, and while it's not too bad, the prices are high for a cable. It occurs to me, though, that since I'm barely using a fraction of the available bandwidth anyway, could I use other, cheaper, long cables? USB4 cables come up a lot in my searches for 2m TB4 cables, for example (although they're mostly from AliExpress, so I don't know how good an idea it is to buy from them). If the chassis has TB4 controllers in it, as does the laptop to which it's attached, can one just put a USB4 cable between them? Are they physically different?

For that matter, since my bandwidth needs are so tiny, could I just find cheaper, longer TB3 cables?
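Whatever cable ends up in between, one sanity check I'd plan on is asking macOS what link actually got negotiated, which should make it obvious if a cheaper or longer cable has silently dropped the connection to a lower speed:

system_profiler SPThunderboltDataType

The report lists the Thunderbolt/USB4 ports and anything attached to them along with the negotiated speed, so the chassis either shows up at the rate the Blackmagic card needs or it doesn't.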

 

I don't know my terminology very well. I just bought this eGPU enclosure. It also comes with an m.2 slot, and I suspect that's probably what this 4-pin power socket is for.

I have a spare ATX PSU to power this thing with, and it's not modular: the cables come out of the PSU box in a big messy bundle and there's nowhere to detach or attach cables. There are lots of different connectors coming out of this bundle, but alas no square arrangement of 2 rows of 2 pins as needed by this chassis.

There are, however, 2 such connectors that are kind of joined together through a little plastic catch, but in a manner where you can slide them apart. It's clearly intended that you be able to separate these if you want to, but them being attached to each other in the first place has me a little worried.

The cable from which they each branch has TKG written on it, and the connectors have L and R printed on them respectively. If I separate them, I can definitely fit one into the socket, but is there any reason one shouldn't do this?

UPDATE: It works!! Initially the chassis wouldn't power on, but I discovered that if I simply don't plug anything into the 4-pin socket at all, then it does. I'm pretty sure that socket is for powering an m.2 drive if you have one, and that was one of the things that made me decide to buy this particular chassis, so it doesn't look great. I'm hoping that if I actually had an m.2 drive to test with, plugging the PSU connector into the 4-pin socket would work; at the moment, with no such drive connected, the entire chassis doesn't power on if that connector is plugged in.

Even better, the Blackmagic card works!! This is a big relief, because the manufacturer only responded to my email asking whether it would work after I had already ordered it, and they said it wouldn't, so the fact that it does is great news.

Word of advice for anyone testing this with standard computer monitors instead of proper reference monitors like me: your monitor might say "out of range" or similar for a lot of standard video frame rates, but for testing purposes I was able to get it to work at 60p. No good for a real project, but hopefully with a real reference monitor that wouldn't be an issue.
