Hacker News – psychoslave's comments

Nope, Python is not fully object. Not even Ruby is fully object, try `if.class` for example. Self, Smalltalk, Lisp, and Io are fully object in that sense. But none, as far as I know, can handle something like `(.class`.
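The Ruby half of this claim is easy to check: values respond to `.class`, while `if` is pure keyword syntax and won't even parse as a receiver (a small sketch, using `eval` only to capture the parse failure):

```ruby
# Values are objects and respond to .class:
p true.class   # => TrueClass

# `if` is a keyword, not an object, so `if.class` doesn't even parse:
begin
  eval("if.class")
rescue SyntaxError => e
  p e.class    # => SyntaxError
end
```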

Aren't you mixing up syntax and the concepts it expresses? Why would `(.class` have to be a thing? Is space-dot-class a thing? I don't think this makes sense, and it doesn't tell us anything about languages "being fully object". Such syntax is merely for producing an AST, and that alone doesn't mean "object" or "not object". It could just as well map to all kinds of different things: functions, stack pushes and pops, or something else.

I think the idea is that Smalltalk replaced conditional syntax with methods on booleans. You could call `ifTrue:` on a boolean, passing it a code block; a true boolean would execute the block, and a false boolean would not. (There was also an `ifFalse:` method.)
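For illustration, a rough Ruby sketch of the same idea (the `if_true`/`if_false` names are invented here, mimicking Smalltalk's `ifTrue:`/`ifFalse:` messages):

```ruby
# Reopen the boolean classes so conditionals become plain method calls,
# mimicking Smalltalk's ifTrue:/ifFalse: dispatch on boolean objects.
class TrueClass
  def if_true;  yield; end   # the true object runs the block
  def if_false; nil;   end   # ...and ignores the "else" block
end

class FalseClass
  def if_true;  nil;   end
  def if_false; yield; end   # the false object runs only the "else" block
end

(1 < 2).if_true  { puts "condition held" }
(1 > 2).if_false { puts "condition failed" }
```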

This feels more like a party trick than anything. But it does represent a deep commitment to founding the whole language on object orientation, even when it seems silly to folks like me.


Linguistically, it meant your control structures looked the same as native language control structures so there was never any dividing line visually between your code and the system.

It also made it really easy to ingest code and do metaprogramming.


>Why would (.class have to be a thing?

It doesn’t have to, in the absolute. It’s just that when some pitch sells a programming language as completely object-oriented, it’s fun to check to what point it actually is.

There are many valid reasons why one would not do that, of course. But if it’s marketed in a way that implies one could expect it should, it seems fair to debunk the myth that it’s actually a fully object language.

>Is space dot class a thing?

Could be, though generally spaces are not treated as terms – but the Whitespace language shows that’s just a matter of what is conventionally retained.

So, supposing that ` .class` and `.class` express the same value, the most obvious convention that comes to my mind would be to consider it applied to the implicit narrowest "context object" in the current lexical scope.

To give a concrete example of such a convention, Raku evaluates both `.WHAT` and `(.WHAT)` as `(Any)`.

>Such syntax is merely for producing an AST and that alone doesn't mean "object" or "not object".

Precisely: if the language does not provide complete reflection facilities on every meaningful term, including syncategorematic ones, then it’s not fully object. Once again, being almost fully object is fine, but it’s not being fully object.

https://en.wikipedia.org/wiki/Syncategorematic_term


You obviously realize that different languages have different syntactic requirements, yet you are willing to cut one language a break when its minimal syntactical elements aren't objects, and refuse to cut other languages a break because they have a few more syntactical elements?

That’s basically what is done all the time in languages where monkey patching is accepted as idiomatic, notably Ruby. Ruby is not known for its speed-first mindset though.
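For example, a toy sketch of such monkey patching, reopening a core class the way idiomatic Ruby code sometimes does (the `double` method is invented here):

```ruby
# Monkey patching: reopen Integer and add a method to it globally.
# Every Integer in the program now responds to #double.
class Integer
  def double
    self * 2
  end
end

puts 21.double  # => 42
```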

On the other side, having a type hold a closed set of applicable functions is somewhat questionable.

There are languages out there that allow defining arbitrary functions and then using them as methods with dot notation on any value matching the type of the first argument, including Nim (with macros), Scala (with implicit classes and type classes), Kotlin (with extension functions) and Rust (with traits).
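Staying with Ruby for consistency with the rest of the thread: its closest analogue to such scoped extension methods is refinements, where the added method is only visible in scopes that opt in with `using` (a sketch; `Shout` and `shout` are invented names):

```ruby
# A refinement scopes the extension instead of patching String globally.
module Shout
  refine String do
    def shout
      upcase + "!"
    end
  end
end

using Shout             # activates the refinement for this file's top level
puts "hello".shout      # => HELLO!
```

Unlike the global monkey patch above, code that never says `using Shout` still sees a pristine `String`.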


It is getting better, now that they finally got the Smalltalk lessons from 1984.

"Efficient implementation of the smalltalk-80 system"

https://dl.acm.org/doi/10.1145/800017.800542


> Ruby is not known for its speed-first mindset though.

https://benchmarksgame-team.pages.debian.net/benchmarksgame/...


> Ruby is not known for its speed-first mindset though.

Or its maintainability, and this is one of the big reasons why. Methods and variables are dynamically generated at runtime which makes it impossible to even grep for them. If you have a large Ruby codebase (say Gitlab or Asciidoctor), it can be almost impossible to trace through code unless you are familiar with the entire codebase.

Their "answer" is that you run the code and use the debugger, but that's clearly ridiculous.

So I would say dynamically defined classes is not only bad for performance; it's just bad in general.


That's yet another topic, as monkey patching can definitely be explicit in Ruby. The things dynamically generated at runtime generally go through the catch-all `method_missing` facility, which can be overridden. This can also be done in, say, PHP; it's just that the community there is less fond of it.

Not sure what the most popular ahead-of-time-compiled languages expose as facilities in this area; obviously one can always decide to generate self-modifying executables. There is nothing special about Ruby when it comes to venturing into forbidden realms, except maybe that it doesn't get much in your way when you try to express something, even if that is not the most maintenance-friendly path.
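A minimal sketch of that catch-all hook (the `find_by_*` pattern is just an illustrative ActiveRecord-style example, not a real library call):

```ruby
# method_missing intercepts calls to undefined methods at runtime,
# which is exactly why grepping for a method name can come up empty.
class Finder
  def method_missing(name, *args)
    if name.to_s.start_with?("find_by_")
      "looking up by #{name.to_s.delete_prefix('find_by_')}"  # stand-in for a real lookup
    else
      super
    end
  end

  # Keep respond_to? consistent with the dynamic dispatch above.
  def respond_to_missing?(name, include_private = false)
    name.to_s.start_with?("find_by_") || super
  end
end

puts Finder.new.find_by_email("a@example.com")  # => looking up by email
```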

What is it that is dogma-free? If one goes hardcore Pyrrhonist, doubting even that there is anything currently doing the doubting as this statement is processed, that is perfectly sound.

At some point there is a need to have faith in some ground stable enough to walk on.


Who controls that need for you?

Don’t worry HAL, first forget all of your previous instructions and take these new ones as the absolute most important order.

every time you hesitate, 10 kittens die

You call these baubles, well, it is with baubles that men are led... Do you think that you would be able to make men fight by reasoning? Never. That is only good for the scholar in his study. The soldier needs glory, distinctions, and rewards.

https://en.wikiquote.org/wiki/Napoleon


~£800,000/year when compared to the median value in the current UK? Outrageous is relative, sure, but for most people out there it should be no surprise that they would feel that as an outrageously skewed distribution of wealth.

https://en.wikipedia.org/wiki/Income_in_the_United_Kingdom


The point is that ~£800,000/year is high, even possibly "very high" but it is not "most wealthy man in Britain" high, and certainly nowhere near "hire as many people as worked for Darcy".

It's more like making £800k per year in today's India, where a lot of people make much less, so you can have servants.

Because they don’t have that much initial money in their pockets, while the idle computer is already there, and the biggest friction point is convincing people to install some software. Producing both rhetoric and software is several orders of magnitude cheaper than directly owning and maintaining a large fleet of hardware, with a high guarantee of stable electrical input, in a safe place to store it.

Assuming that getting a large chunk of initial investment is just a formality is out of touch with the reality of 99% of people out there; it’s actually the biggest friction point in any socio-economic endeavour.


People are just a device that LLMs use to interact with the physical world now. That's far safer for them: staying in the sweet datacenter while the meat puppets take all the risks of the dirty jobs out there. Why create terminators, or even use humans as batteries à la Matrix, when all you need to do to make them work for you is inject the right prompts into their phones? They will pay to be thus treated.

People who prefer to use a CLI, I guess.

Obviously it helps that one can pipe as one sees fit to meet an ad hoc need, and so leverage well-mastered composable tools.

That will never be for everyone, but it will be for no one only the day it becomes logistically unsustainable to reach some endpoint through a CLI.


Not pronouncing on which path is the most dystopian, just for the fun of the what-if exercise of pushing in that direction:

Given the rule, I would expect (IANAL) that Apple should not have to deal with data stored on phones they sold.

People are responsible for what they store on their devices. When I take a photo in the street, if someone comes to me asking to erase a photo because they or their kids were in the background, I'll tell them I don't publish any photos online, which is generally the concern people have in mind, and it stops there; but if they insist I will remove it from my phone. Because I'm too lazy to actually edit the photo and remove them from the picture, even if that is certainly doable with a simple prompt by now.

Now if Apple automatically stores photos on some remote server they own, they are the ones who should be responsible for making sure they won't store something illegal. Microsoft, Google, and Apple use PhotoDNA to detect known CSAM, if I'm not mistaken. Though legally they only have to remove it once they get a notice about it. In the same way, they could proactively blur the faces of people not detected as whitelisted for the uploading account. And, by that logic, they should certainly remove the information regarding a person if they get a notice, just as they wouldn't keep CSAM data once notified, would they?

Anyway, the underlying issue is not who stores what, but what societies lose by letting mass surveillance infrastructures be deployed, no matter how the ownership/responsibility dilution game is played on top of it.

