
  1. Lex Fridman began to say some very interesting things about the short-term threats of AI, and then the conversation became sidetracked by anticipating long-range possibilities such as an embodied general intelligence capable of learning from the physical world. There's not nearly enough focus on the current – and historical – impact of AI technologies, although Lex's observations about reproducing biases inherent in data are very germane. There's a bigger conceptual challenge we face in relation to the "goodness" of technology. Climate change is forcing this conversation to happen, to some extent, but there's not enough exploration of its full implications. We need to radically examine our assumptions about the role of tools – in general – in relation to both human happiness and fulfillment, and the health of broader ecosystems. Agriculture, for instance, has made it possible to support many more people, but:
    (1) At times and places, it has resulted in a stunting of human growth and potential due to poor nutrition (more people does not always mean healthier people). Counter-intuitively for some, there is evidence that hunter-gatherers were generally bigger and healthier than most people living in the first civilisations based on the cultivation of cereal crops.
    (2) Growing evidence strongly suggests that the current dominant systems for producing meat and plant food are not environmentally sustainable, and that agricultural systems have led to civilisational collapse in the past by being environmentally and socially unsustainable.
    My point here is not that farming is bad. My point is that the very short-term benefits of a set of tools (cultivation) can blind us to unforeseen and poorly understood negative impacts in the medium, long and very long term. Ideally, this means we should pay more attention to mitigating the long-term risks of particular technologies. Is ecologically sound agriculture less intrinsically interesting or technologically challenging than conventional agriculture? I don't believe so – it is just that far fewer resources are allocated to it.
    How resources are allocated to the development of technologies is a critical question, as it involves some rather fundamental technologies – money and markets. And it is in this space that I think there is a much greater and more urgent need for critical engagement with the development and impact of algorithms and AI learning systems. To a significant extent, the allocation of resources to the development of AI learning systems is being driven by very simple algorithms – such as those that people with power over resources use to determine the extent to which Google has value, which revolve around optimizing and calculating profit. Inasmuch as money, markets and algorithms are tools we have invented that are important drivers and building blocks for the development of AI, I think it is really necessary for us to begin to use and develop these tools in far more reflexive ways than we currently do. I have a deep hunch, which I am trying to unpack, that the ability to develop a fully reflexive understanding of technology (including social technologies such as money) is a fundamental bottleneck for intelligence.
