Pinned post

Say you're using a block of memory to store data that you're going to need later. Then you can't use that same memory for something else, unless you swap out the data or compress it or something first, right?

Not quite!

falsifian.org/blog/2021/06/04/

Pinned post

Hi! My main skills are in theoretical CS and . I'm interested in , , (playing the ), (the board game), and lots of other things I'm forgetting.

I've been taking time to relax and play after having most recently worked at . Some things I've been doing:

* teaching

* projects, e.g. falsifian.org/blog/2020/11/18/

* , mostly and

A question: how many of you have friends who use Mastodon who you met elsewhere?

James Cook boosted

streetcars normally have signs on many of the seats marking them restricted because of the pandemic, but mine this evening has none. Maybe they are retiring those.

Come explore a strange mathematical object called a 3-sphere! I made a demo of non-Euclidean geometry using , and the idris2-dom library.
falsifian.org/blog/2022/01/17/
sr.ht/~falsifian/s3d/

Why isn't the 0BSD license more popular? I use it sometimes, because my ideal is that people who use my code shouldn't have to think about copyright law. I don't like copyright, and resent that my work is encumbered by it whether I like it or not. I would like people to give me attribution, but legally forcing them to doesn't sit well with me. Is this an uncommon sentiment? Is 0BSD not a good fit for me for some reason? Is there some other concern, e.g. some legal uncertainty?

I'm reading Heaven's River, the fourth book in Dennis E. Taylor's Bobiverse series. His works aren't especially deep, but they're light-hearted and fun. The series is about a person named Bob whose brain gets copied into a Von Neumann probe, from which he proceeds to make lots more Bobs so they can save the world together.

James Cook boosted

programming, long 

@cadadr@weirder.earth @operand When it comes to C, it's very platform-dependent. E.g. if you're using the GNU toolchain (autotools, pkg-config, and lib-something, I forget), it's actually fairly straightforward to include dependencies. But the main problem there is that the way C and C++ handle modules of programs is... well, I don't want to say broken, given the age of the languages, but definitely outdated.
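
For concreteness, this is roughly what the pkg-config side of that workflow looks like; libpng here is just a stand-in for whatever installed, pkg-config-aware library you depend on:

# One-off compile: ask pkg-config for the compiler and linker flags.
cc demo.c $(pkg-config --cflags --libs libpng) -o demo

# In an autotools project you'd let configure find it instead:
#   configure.ac:  PKG_CHECK_MODULES([PNG], [libpng])
#   Makefile.am:   demo_CFLAGS = $(PNG_CFLAGS)
#                  demo_LDADD  = $(PNG_LIBS)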

IIRC it was fairly straightforward on Windows as well, but I only coded on Windows as a kid back in 2005, so I can't speak to that.

For other languages, package managers aren't actually a new thing. CPAN for Perl and CTAN for TeX predate stuff like NPM by a decade and change.

IIRC you'd mostly use your distro's packages, or use your IDE on Windows to get libraries, or, if you were a serious developer, you'd use the language's package manager, say cpan. But the end user, or even the beginner coder, hardly ever heard of these tools, let alone needed them.

What really changed circa the early 2010s was that we went from the package manager / repo as a curated, community-run place that keeps tabs on what's published to a wild west where shit comes directly from the git repos of random coders. In my experience this was driven by node.js and especially npm, which brought into this packaging space a lot of non-programmers who did not really understand the technical implications of the decisions they were making.

This was on the one hand good ofc, democratising programming a bit, but it also led to a cultural shift towards some pretty horrid programming and software distribution practices (e.g. the infamous node_modules; there's a reason you don't hear about an infamous vendor/ for Ruby, or an infamous ~/perl5 for Perl).

In that culture we got used to "reinventing the wheel", "release early, release often", "move fast, break things"; these were like shibboleths of the Node.js "subculture" back then.

We also got the tiny libraries with this development. Like, it was common, say if you were a C++ dev, to use Boost or Qt for most of the functionality the standard library didn't include. Python always had a huge stdlib, and while Perl's wasn't as large, it was still pretty big. Ruby, well, you either did Rails with it or TextMate extensions, so that was okay, and even then its core libs are way richer than what Ryan Dahl's 0.x releases of Node.js included. If you needed to do something with images, you used ImageMagick, not a bunch of tiny libraries that adhere to a made-up-after-the-fact so-called "unix philosophy".

So people, instead of bothering to come together and create a stdlib and utility libraries for this new programming language (basically, it's JS, but not really the browser's JS), started packaging up little bits of functionality into tiny packages, and because a lot of the people using these were amateurs and/or beginners, they didn't have the experience to integrate others' code or maintain software. Node.js immediately got corporate attention tho, and this meant that this amateur culture that's constantly "reinventing the wheel" without any engineering skills suddenly became commercialised. Joyent took over node.js, Ryan Dahl disappeared, Isaac Schlueter became the maintainer, and these problematic practices ossified in the Node community.

Then this ofc became the way of doing things in start-up coding culture, which to this day is where mainstream programming culture is created, and it inevitably seeped into other programming language communities and became the default attitude: move fast and break things, tiny isolated and "opinionated" libraries, version numbers going up like COVID cases.

But the thing is, this attitude is error-prone and tiresome, like physically tiresome. You never know where to start; docs are out of date on the day of release, if they exist at all; tools that you need to be transparent are fresh and buggy in day-to-day use; everything is ad hoc and barely compatible, if at all; every package is opinionated, and tiny, and developed by a single person who has 100 other packages to maintain.

If you're developing something like Boost, Django, ImageMagick or similar, you can afford to build a team and infrastructure around it: a Bugzilla or debbugs instance, source code hosting, mailing lists or an IRC channel, etc. But if you're the sole dev of 100 packages whose average size is 100--500 lines of code, you can't afford that, so you're beholden to forge silos like GitHub.

These existed before too, like SourceForge or whatnot, but the difference was that your program needed 1 or 2 major libraries, and then a few utility libraries here and there to make things more comfortable. With node.js, mainly, we got a culture of using as many libraries as you can, tho. The idea was that if you avoid writing code, you avoid bugs, but in reality those bugs just moved into the libraries and into the glue code that made the 100 libraries work together.

An exception was CPAN ofc, because they did have a lot of packages and a culture of smaller (but not as small) libraries. What they did differently, tho, was distributed, community-run, constant testing of all packages. You can trust a package from CPAN because hundreds of people test it and the test results make it back to CPAN. I didn't really use CPAN a lot so idk the specifics, but CPAN Testers is a unique, beautiful thing that I wish every language had: qa.perl.org/cpan-testers/

So all this stuff led to the current situation that's "enjoyed" not only by the Node ecosystem, but by other languages, distros, and even tools like Docker today. There's no QA, everything is either a corporate git repo or an unemployed maintainer's GitHub, and it's considered okay to rely on this stuff because... well, because we kinda found ourselves inside this situation and whole generations of programmers were raised in it. And when you want to introduce QA or a CoC or whatnot, you're a gatekeeper, anti-progress, anti-innovation, or whatever other nonsense SV nerds come up with.

So when you say

the tools represent a certain kind of culture wrt dependency management, and automatically including transitive dependencies is an anti-pattern.

you're extremely on point; you really hit the nail on the head, so to speak: this entire culture that was emerging in the early 2010s got mainstreamed by the tools developed around it and ossified as we neared the 2020s, the know-how and techniques developed over the '90s and '00s were stigmatised (with a lot of ageism too) and forgotten, and now programming with 3rd-party libraries has become a minefield.

I finally set up an IRC bouncer. pounce from causal.agency/ is proving simple and effective. (catgirl is also quite nice.)

I like to keep plain text IRC logs. I figured out I can take advantage of pounce's -v option to do that by running it like this:

# Keep only the lines pounce prints with -v that begin with "<<" or ">>" (the raw
# IRC traffic), prefix each with an ISO 8601 timestamp, and append to the log file.
# Note: strftime() is a gawk extension, so this assumes an awk that provides it.
pounce -v "${CONFIG_NAME}.conf" | \
awk '/^(<<|>>)/ { printf("%s\t%s\n", strftime("%FT%T"), $0); fflush(); }' >> \
"$LOG_FILE"

James Cook boosted

Can't recommend DigitalOcean 

DigitalOcean was just 'ok' in my books - they seemed fairly reliable and their VPS services seemed OK, but...

I started to question their practices a couple of years ago, when I found that their web login, which had started sending a 2nd-factor code via email, was kind of broken. There's a button on the login to "Trust this device for 60 days", but no matter whether that was checked or not, it never worked.

When I questioned them about it, they basically stated they knew it was an issue, it was unlikely to ever be changed, and one should enable "real" 2FA on their account.

??? I wasn't thrilled with that, but went on my way, figuring I would start shopping around for something else that would work.

I presently have one last little VPS there that I have to get a client's DNS off of before I can shut it down. I went to check up on my services & whatnot via the DO website today, and their little broken 2nd factor code wouldn't show up in my email.

Checking to see if it arrived or not, I find that their emails are now coming via "mandrillapp.com" (mailchimp, apparently).

Their friggin' corporate email with customer 2nd factor login codes is being delivered via a 3rd party SMTP server.

I don't understand how companies make such nonsense decisions unless they're incompetent. DO is basically saying they don't have the expertise to run a mail server.

I'm sorry, I can't recommend an organization that does this - venture forth warily!

(and I can't wait to shut down my last VPS there)
James Cook boosted

@eludom@fosstodon.org Green screen, storage tube, can draw "1500 vector-inches per second", but the I/O interface is limited to 9600 baud in this case, which slows it down quite a bit. Individual vectors cannot be erased; the entire screen must be blanked and redrawn (which causes a flash to all-bright and then all-dark and takes about 750 ms).

Very cool devices, very obsolete. Just my kind of thing. ;-)

It's sometimes said OpenGL uses homogeneous coordinates, but I think it's more accurate to say it uses these coordinates, if you look at the clipping rules. E.g. the triangle (0 0 0 1) (1 0 0 1) (0 1 0 1) takes up half the viewport, but (0 0 0 -1) (-1 0 0 -1) (0 -1 0 -1) won't get drawn at all.
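
To spell that out with the standard clip test (this is just the usual clip-volume condition, nothing specific to the demo): a clip-space vertex (x y z w) is kept only if -w <= x <= w, -w <= y <= w and -w <= z <= w. For (0 0 0 1) that reads -1 <= 0 <= 1, which holds; for (0 0 0 -1) it would require 1 <= 0 <= -1, which is impossible, so that vertex is always clipped even though the two quadruples name the same point in ordinary homogeneous coordinates.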

Also relevant to me because I'm working on a geometry demo where this coordinate system is really useful.

Show thread

question:

With homogeneous (aka projective) coordinates, we say av = v for any nonzero scalar a and vector v.

What if you require a to be positive? Is there a name for the resulting coordinate system?

For example, (1, 0, 0), (0.1, 0, 0) and (10, 0, 0) are all the same point, but (-1, 0, 0) is a different point.

The set of equivalence classes is isomorphic to a sphere, but I can't call them "spherical coordinates" because the name's taken.
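
(One way to see the sphere claim: each class {a·v : a > 0} contains exactly one unit vector, namely v/|v|, and two nonzero vectors are in the same class exactly when they normalize to the same unit vector, so the classes correspond one-to-one with points of the unit sphere.)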

was beautiful, but I wish the pace were slower. I'm glad it didn't try to cover the whole book at once.

Also, I wish I'd brought ear plugs! The theatre set the volume too high.

James Cook boosted

Hi everybody!
I am an expert in deciphering old German script - for example letters, diaries, and birth certificates written between about 1650 and 1950. I'm planning to toot funny or weird findings from my work and from my private genealogical research, and also give general advice regarding German genealogy.

#introduction

$ man bundle
No manual entry for bundle
$ bundle --help
BUNDLE(1)

NAME
bundle - Ruby Dependency Management

SYNOPSIS
bundle COMMAND [--no-color] [--verbose] [ARGS]

...

() question:

My phone often misses calls and shows me SMSes late. E.g. my phone will buzz, and I'll open it and see I got an SMS 10 minutes ago. Or I'll check my phone and see there's a missed call, but I'm pretty sure it never buzzed. This happens in places where reception should be okay.

Who or what is likely to blame for this? I don't really know how calls and SMSs work. Is it likely a LineageOS bug? Or something I could complain to my provider (Freedom Mobile) about?
