• 0 Posts
  • 20 Comments
Joined 1 year ago
Cake day: July 8th, 2023


  • Removing bias from IQ tests is one hell of a challenge, but if we put that aside and only analyse IQ results from people from similar backgrounds, it definitely measures something, and it usually gives consistent results, meaning your score would not change much if you took the test again.

    IQ scores correlate with someone's general ability in pattern recognition, language, logic, bias checking, and so on. They also correlate with grades, salary, and lifespan. So, is that intelligence? I don’t know, but it is something.


  • I’d argue that, given enough time and effort, almost anyone can become a domain expert in specific things and do incredible stuff. What distinguishes smart people from simpler folks usually boils down to them having a very easy time processing new material, which includes the ability to filter noise and fact-check.

    I don’t like the term “stupid”, but there hasn’t been a whole lot of evidence supporting the idea that human intelligence is compartmentalized. Humans with high IQs tend to outperform on average at most of what they try. A low IQ probably means you will have to work harder and specialize to achieve the same degree of competency. This is just my hot take; I’ve fallen into this rabbit hole before and read a lot on the origin of IQ tests. In the end, intelligence alone does not determine a person’s worth anyway.


  • IQ tests were first developed because it seemed obvious that not all students performed equally. On average, a student who does well in a given discipline will also tend to do well in other, unrelated disciplines. “On average” is the key phrase here; outliers exist.

    I think gifted students can easily tell what side of the curve they’re on, even if they might not want to acknowledge it. It is not even about the grades, because gifted students also often learn early on that they can get away with doing the minimum amount of work and still get passing grades. So they’re not necessarily top of the class.

    Gifted students get told they’re fast learners all the time, and they notice how everyone else seems to be progressing in slow motion. They know.

    I think it gets harder to self-evaluate the closer you are to the average, since most of your peers will be more or less just as intelligent as you. And the duller you are, the less you can identify competence, and the more likely you are to be overconfident.

    I think, in the end, most people will end up believing they’re above average because we tend to notice dumb people a lot. Ironically, it is probably students who are just slightly above average who will have the most self-doubt, because they feel different from their peers, yet they can probably tell more gifted students are around.

    Source: 50% my ass, 50% being surrounded by incredibly smart people who shared their personal experiences with me.


  • Elderos@sh.itjust.works to 196@lemmy.blahaj.zone · rule · 10 months ago

    There is a nuance though, because a language simply being interpreted does not mean it is being used as a scripting language. Take Java and C#: those languages are interpreted by default, which allows you to ship platform-agnostic binaries and a bunch of other neat features. C# can be used as a scripting language whenever it is interpreted, but it does not have to be. It is an important nuance, and it is why you can’t just replace the term “scripting language” entirely. You can also compile C# directly into machine code, skipping the interpreter entirely. Technically, there is nothing stopping you from writing an application that uses C# as a scripting language even without the interpreter, since you can compile C# to machine code and simply load the library dynamically at runtime (kind of like Unity does).

    I guess you could call those “embedded languages”, and it would mean almost exactly the same thing, but then aren’t we back to the same problem of some developers taking offence at that? I mean, it does imply that the language does not stand on its own without machine code, which is true. This is one weird hill to have a bruised ego over for those developers you’ve met. Words have meaning, and this one just happens to be a right fit given the description.

    I have a feeling from this whole exchange that you didn’t know what scripting languages were, considering how you replied to my first post. I have worked in development for over a decade and I have never seen the term used with negative implications. I really just think you projected your own feelings onto a term you didn’t understand. No offence intended, it happens.


  • Elderos@sh.itjust.works to 196@lemmy.blahaj.zone · rule · 10 months ago

    There are definitely people out there shitting on all sorts of languages, and JS is a huge target, but those languages have been referred to as scripting languages for as long as they have existed. The term stems from the fact that those languages are embedded into existing applications, as opposed to being built into binaries. Nowadays you have hybrids like C#, which can be used either as a scripting language or to build native apps (or something in between), so it is really just a matter of the context you’re using the language in. There is inherently no hidden meaning or elitism in the term. It is a very old term, and I think you simply got the wrong impression from your internet experiences. It is how those languages are defined basically everywhere. Even some of those languages’ own official websites self-define them as scripting languages. There is no ambiguity here at all.
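    As a rough sketch of what “embedded into an existing application” means in practice, here is a minimal host loading a user-supplied script at runtime (Python used for illustration; the `load_plugin` name is hypothetical):

```python
import importlib.util

def load_plugin(path: str):
    """Load a user-supplied script file at runtime, the way a host
    application embeds a scripting language for plugins or mods."""
    spec = importlib.util.spec_from_file_location("plugin", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # run the script inside the host process
    return module
```

    The host stays compiled and static; the script is just data it executes, which is the sense in which a scripting language “does not stand on its own”.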


  • Elderos@sh.itjust.works to 196@lemmy.blahaj.zone · rule · edited · 10 months ago

    I wanted to get back to you, because you are so very right, and I have spent the last 10 years or so trying to evangelize the fact that implementing algorithms and logic isn’t the hard part; it is a trivial concern, really. Everything that goes wrong in development usually involves the flow of data, and figuring out how to get that data from over here to over there without making a big mess. To do that, you absolutely need to write small modules with few dependencies. You gotta think about the life cycle of your objects, and generally follow the SOLID principles if you’re doing OOP. Personally, I really love using dependency injection when the project allows for it.
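    To make the dependency injection point concrete, here is a minimal sketch (Python for illustration; the class names are made up). The service never constructs its own dependency, so the data flow is explicit and the class is trivial to test with a fake:

```python
class Database:
    """Real dependency; imagine an actual connection behind query()."""
    def query(self, sql: str) -> list:
        return []

class UserService:
    # The dependency is injected through the constructor instead of being
    # created internally, so UserService stays a small, isolated module.
    def __init__(self, db: Database):
        self.db = db

    def user_names(self) -> list:
        return list(self.db.query("SELECT name FROM users"))

class FakeDatabase(Database):
    """Test double injected in place of the real thing."""
    def query(self, sql: str) -> list:
        return ["alice", "bob"]

service = UserService(FakeDatabase())
```

    Swapping `Database` for `FakeDatabase` requires no change to `UserService`, which is the whole point.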

    It is as you said, really: you can have thousands of hours of programming experience, but if you have never tried to solve those issues you’re really limiting yourself. Some devs think designing software around your data instead of your algorithms is overthinking it, or “over-engineering” as I have been told. Well, I would not hire those people, for sure.

    I have seen clean projects made up of small modules, with clear boundaries between data, functions, and the life-cycle configuration. It is night and day compared to most code bases. It is striking just how much of the hidden and not-so-hidden complexity, goo, hacks, and big-ass functions in most code bases really just exist because application life-cycle management is often non-existent. In a “proper” code base, you shouldn’t have to wonder how to fetch a dependency, whether an object is initialized and valid, where to instantiate your module, or even which constructor to invoke to build a new object. This takes care of so much useless code it is insane.

    To close on this, I like scripting languages a lot as well, and you can do great things with some of them even if a lot of developers don’t. JS has TypeScript, ReactiveX, dependency injection frameworks, and so on. It is a great language with a lot of possibilities, and you’re not forced into OOP, which I think is great (OOP and functional programming are orthogonal solutions, imo). But the reality is that the language is really easy to misuse, and you can definitely pick up bad habits from it. Same as you, I would be wary of a developer with no experience with strongly typed languages, or at the very least TS. I am very happy to hear this take randomly on the internet because, in my experience, this is not how most developers operate, and imo it is demonstrably wrong not to design applications around your data.





  • Elderos@sh.itjust.works to 196@lemmy.blahaj.zone · rule · 10 months ago

    I worked under a self-proclaimed Python/JavaScript programmer, and part of the job involved doing rather advanced stuff in various typed languages like C# and C++. It was hell. The code reviews were hell. For every tiny little thing, we had to go through why “coding C++ like it is Python” is a very bad idea.

    What is crazy about developers who exclusively work with scripting languages is that they have no conception of why general good practices exist, and they will often make up their own rules based on their own quirks. In my previous example, the developer in question was the author of a codebase that was in literal development hell, but he was adamant about not changing his ways. I’d definitely be wary of hiring someone who has exclusively worked with scripting languages, and sometimes it is less work to train someone who is a blank slate than to try to deprogram years of bad habits.



  • In some countries we’re taught to treat implicit multiplication as a block, as if it were surrounded by parentheses. Not sure what exactly this convention is called, but afaic this shit was never ambiguous here. It is a convention thing; there is no right or wrong, as the convention needs to be given first. It is like arguing the spelling of color vs colour.
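    For illustration, the classic viral expression comes out differently under the two conventions (implicit multiplication binding tighter than division, versus plain left-to-right evaluation):

```latex
6 \div 2(1+2) = 6 \div (2 \times 3) = 1
\qquad \text{vs.} \qquad
6 \div 2 \times (1+2) = 3 \times 3 = 9
```

    Both results are “correct” under their respective conventions, which is exactly why the expression is ambiguous without stating one.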




  • Elderos@sh.itjust.works to Memes@sopuli.xyz · Ouch · 1 year ago

    Yep, in big studios the big guys making the decisions really couldn’t care less what product is actually being made. They expect X return on investment by Y date, and you had better be shipping your game then, because resources are already being reallocated to that new project that was already in pre-prod while you were finishing the previous one.

    Game devs are also artists in their own way. It sucks for them when a game, sometimes one that had lots of potential, gets released in an unfinished state. Your reputation takes a hit, and people blame the QA and the devs, but really the big guys are almost always to blame. More mid-term money that way, fewer bonuses to pay, players still buy the unfinished games, and so on.



  • What you seem to be describing is one big class with lots of responsibilities, not a circular dependency. Personally, I don’t think it is ideal, and I don’t know about your specific case so I could be wrong, but I have never seen a legit case for bloated classes. That being said, making a big class is still much better than splitting it into interdependent classes. Classes that know about each other are so cohesive that they might as well be the same class anyway.

    To add onto the circular dependency problem: it is not just about readability and cognitive load (though there is some of that). Cyclic dependencies actively break things and make it much harder to manage the life cycle of a program: no dependency injection, poor memory management, long compile times. It is a huge hack, and I understand that you think it can be the proper solution sometimes, but it is really just a bad thing to do, and it will bite you some day. And I am assuming here that you’re using a forgiving language; in some languages your code won’t even compile or run past a certain point, and good luck cleaning up that mess.

    edit: replaced “module” with “class” for consistency
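    A minimal sketch of breaking a would-be cycle (Python for illustration; `Order` and `Inventory` are made-up names). Instead of the two classes referencing each other, one of them depends only on an injected callback, so the dependency arrow points one way:

```python
from typing import Callable

class Order:
    # Order does not know Inventory exists; it only holds a callback,
    # so there is no Order -> Inventory -> Order cycle.
    def __init__(self, on_placed: Callable[[str, int], None]):
        self._on_placed = on_placed

    def place(self, item: str, qty: int) -> None:
        self._on_placed(item, qty)

class Inventory:
    def __init__(self):
        self.stock = {"widget": 10}

    def reserve(self, item: str, qty: int) -> None:
        self.stock[item] -= qty

# Wiring happens in one place, at construction time:
inv = Inventory()
order = Order(inv.reserve)
order.place("widget", 3)
```

    The same trick works with an interface or an event bus; the point is that neither class imports the other.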


  • It does not get more complicated to split your example. What gets more complicated is giving all sorts of unrelated responsibilities to a single class simply because it is the path of least resistance.

    In your example, all you need is an extra module listening for configuration changes and reacting to them. This way you leave your context-specific logic out of your data model, with no need for a cyclic dependency. There are so many downsides to cyclic dependencies that justifying one because splitting your logic is “too complicated” really isn’t a strong argument.
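    The listener-module idea can be sketched like this (Python for illustration; `Config` and `LogLevelWatcher` are hypothetical names). The data model only notifies generic subscribers and never references the reacting module:

```python
class Config:
    """Data model: holds settings and notifies subscribers.
    It has no knowledge of any context-specific logic."""
    def __init__(self):
        self._values = {}
        self._listeners = []

    def subscribe(self, listener) -> None:
        self._listeners.append(listener)

    def set(self, key: str, value) -> None:
        self._values[key] = value
        for listener in self._listeners:
            listener(key, value)

class LogLevelWatcher:
    """Separate module reacting to config changes; the dependency goes
    watcher -> config only, so there is no cycle."""
    def __init__(self):
        self.level = "INFO"

    def __call__(self, key: str, value) -> None:
        if key == "log_level":
            self.level = value

config = Config()
watcher = LogLevelWatcher()
config.subscribe(watcher)
config.set("log_level", "DEBUG")
```

    Adding a second reacting module is just another `subscribe` call; the data model never changes.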