• 0 Posts
  • 4 Comments
Joined 7 months ago
Cake day: April 9th, 2024


  • Yes. Look at cost-benefit ratios.

    For my requirements it has mainly been AMD in the past (and ATI when it still existed), because usually Intel and Nvidia charge much more money and don’t really deliver that much more benefit for what I was looking for. They charge more because they can, since they dominate their respective markets.

    However, there can sometimes be factors that still make the pricier option the better deal compared to AMD. It all depends on your requirements and how much benefit you would actually get from the respective device; a rough example with made-up numbers follows below.
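
    What I mean, as a rough sketch with made-up prices and “benefit” scores (nothing here is real data, and the card names are hypothetical), is simply comparing how much of the benefit you care about each option delivers per unit of money:

    ```python
    # Rough sketch: made-up prices and "benefit" scores, not real data.
    # The idea is benefit per unit of money, not absolute performance.
    cards = {
        "hypothetical AMD card":    {"price": 500, "benefit": 80},
        "hypothetical Nvidia card": {"price": 800, "benefit": 88},
    }

    for name, specs in cards.items():
        ratio = specs["benefit"] / specs["price"]
        print(f"{name}: {ratio:.3f} benefit per currency unit")

    # The cheaper card wins on benefit per money spent (0.160 vs. 0.110),
    # even though the pricier one is "better" in absolute terms.
    ```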


  • You are literally wrong. Nice article; I don’t see how that’s relevant, though.

    Could it be that you don’t know what “intelligence” is, and what falls under the definition of the “artificial” part in “artificial intelligence”? Maybe you do know, but have a different stance on this. It would be good to make those definitions clear before arguing about it further.

    From my point of view, the aforementioned branches are all important parts of the field of artificial intelligence.


  • I totally agree with Linus Torvalds in that AIs are just overhyped autocorrects on steroids

    Did he say that? I hope he didn’t mean all kinds of AI. While “overhyped autocorrect on steroids” might be a funny way to describe sequence predictors / generators like transformer models, recurrent neural networks or some reinforcement-learning-type AIs, it’s not so true for classifiers, like the classic feed-forward network (which is among the building blocks of transformers, btw), or convolutional neural networks, or unsupervised learning methods like clustering algorithms or principal component analysis. Then there are evolutionary algorithms, and there are reasoning AIs like Bayesian nets, and so many more kinds of ML/AI models and algorithms (see the small PCA sketch at the end of this comment for one example that has nothing to do with autocorrect).

    It would just show a vast lack of understanding if someone judged an entire discipline that simply.
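
    To make the contrast concrete, here is a minimal sketch of principal component analysis, one of the methods mentioned above (assuming numpy is available; the toy data is made up): it just finds the directions of largest variance in a data set and projects onto them, with no next-token prediction anywhere.

    ```python
    # Minimal PCA sketch with numpy on made-up toy data.
    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical toy data: 200 samples with 3 correlated features.
    X = rng.normal(size=(200, 3)) @ np.array([[2.0, 0.0, 0.0],
                                              [1.0, 1.0, 0.0],
                                              [0.0, 0.5, 0.2]])

    # Center the data, then take eigenvectors of the covariance matrix.
    X_centered = X - X.mean(axis=0)
    cov = np.cov(X_centered, rowvar=False)
    eigenvalues, eigenvectors = np.linalg.eigh(cov)

    # Sort components by explained variance (largest first) and project
    # the data onto the top two principal components.
    order = np.argsort(eigenvalues)[::-1]
    components = eigenvectors[:, order[:2]]
    X_reduced = X_centered @ components

    print("explained variance:", eigenvalues[order[:2]])
    print("reduced shape:", X_reduced.shape)  # (200, 2)
    ```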