I really can't believe they discontinued the Pro Display XDR... what is wrong with them? A company the size of Apple surely must have the resources to update it every couple of years.
Inference is run on shared hardware already, so they're not giving you the full bandwidth of the system by default. This most likely just allocates more resources to your request.
The ruby::box thing looks pretty interesting; from a cursory glance, you can run two simultaneous versions of something like a feature or rollout much more conveniently.
Also being able to do
if condition1
  && condition2
  ...
end
on multiple lines rather than one - this is pretty nifty too!
I'm kinda hoping that eventually each ractor will run in its own ruby::box and that each box will get garbage collected individually, so that you could have separate GCs per ractor, BEAM-style. That would allow them to truly run in parallel. One benefit would be cutting p99 latency, since far fewer requests would be interrupted by garbage collection.
I'm not actually in need of this feature at the moment, but it would be cool and I think it fits very well with the idea of ractors as being completely separated from each other. The downside is of course that sharing objects between ractors would get slower as you'd need to copy the objects instead of just sharing the pointer, but I bet that for most applications that would be negligible. We could even make it so that on ractor creation you have to pass in a box for it to live in, with the default being either a new box or the box of the parent ractor.
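The copy-instead-of-pointer cost is easy to picture: conceptually, handing an unshareable object to a ractor in a different box would mean deep-copying it, roughly like a Marshal round-trip. This is just a sketch of the semantics, not Ractor's actual copying implementation (which is a specialized deep copy in C):

```ruby
# Sketch: deep-copying an object instead of sharing its pointer.
# A Marshal round-trip illustrates both the cost and the semantics
# of copy-on-send between isolated ractors/boxes.
obj  = { name: "payload", items: [1, 2, 3] }
copy = Marshal.load(Marshal.dump(obj))

copy == obj        # same contents
copy.equal?(obj)   # false: a distinct object, no shared pointer
```

For small payloads that copy is cheap; the concern would mostly be large object graphs crossing ractor boundaries on a hot path.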
They already truly run in parallel in Ruby 4.0. The overwhelming majority of contention points have been removed in the last year.
Ruby::Box wouldn't help reduce contention further; it would actually make it worse, because with Ruby::Box classes and modules have an extra indirection to go through.
The one remaining contention point is indeed garbage collection. There is a plan for Ractor-local GC, but it wasn't sufficiently ready for Ruby 4.0.
I know they run truly parallel when they're doing work, but GC still stops the world, right?
Assuming you mean "because with Ruby::Box classes and modules have an extra indirection to go through" in the second paragraph, I don't understand why that would be necessary. Can't you just have completely separate boxes with their own copies of all classes etc., or does that use too much memory? (Maybe some COW scheme might work, doodling project for the holidays acquired haha)
Anyway, very cool work and I hope it keeps improving! Thanks for 4.0 byroot!
Yes, Ractor local GC is the one feature that didn't make it into 4.0.
> Can't you just have completely separate boxes with their own copies of all classes etc, or does that use too much memory?
Ruby::Box is kinda complicated and still needs a lot of work, so it's unclear what the final implementation will look like. Right now there is no CoW or any type of sharing for most classes, except for core classes.
Core classes are the same object (pointer) across all boxes, however they have a constant and method table for each box.
But overall what I meant to say is that Box wouldn't make GC any easier for Ractors.
In languages where placement doesn't matter, like C/JS, I prefer leading booleans. It makes it much easier to see the logic, especially with layers of booleans.
Personally, && on the new line seems much more readable to me. Can't wait to use some smart cop to convert all existing multiline ifs in my codebase.
For folks who scan code bases based on the front of lines, it makes it easier to grok. Also helps with deleting and inserting lines (similar to leading or trailing commas in lists).
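The commas analogy carries over directly to Ruby: with a trailing comma after every item (a style choice the parser happily accepts), inserting or deleting an element is always a one-line diff:

```ruby
# Trailing comma after each item: adding or removing a line never
# touches the neighboring lines, so the diff stays one line long.
langs = [
  "ruby",
  "python",
  "go",
]

langs.size   # 3 items; the trailing comma adds no empty element
```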
It's funny: I have been writing conditions this way for ages in languages where one can, like Python (if you use a pair of parentheses), and linters have yelled at me to put the binary operator on the previous line. People herald these quite subjective things as truths, just because there is some tool they can delegate responsibility to.
There are others as well but NVidia is aggressive when it comes to punishing companies willing to buy non-NVidia products. As a result, they prefer to remain under the radar, at least until they have enough market leverage to be more widely known.
I imagine two big giants, basically (Nvidia/Google/AMD, influenced by a select few people) vs (Chinese companies with investment from the government).
It's sort of like proxy wars, and that's sort of what's happening on the software side of things with open-source models, but I think the benefit of these proxy wars is going to go to the end consumers.
On the other hand, having two large countries compete with each other while buying up everything else feels like it astronomically increases the price for anyone else who wants to compete with these two giants (any other country, perhaps).
We definitely need a better system where it doesn't feel like we are watching Pac-Man eat everything up.
Is it not? All this money is going into AI under the fear that China will win the race to AGI. China releases open-source models that keep OpenAI/Anthropic researching and training their models, which in turn creates demand for more Nvidia GPUs.
I'm working on MedAngle, the world's first agentic AI Super App for current and future doctors. Invite only, 100k+ users, 150m+ questions solved, tens of billions of seconds spent studying smarter.
All in the MedAngle Super App - literally everything a future doctor needs in one place. 100k+ users, 150m+ questions solved, tens of billions of seconds spent studying smarter
https://medangle.com