https://www.wolfram.com/language/

TODO

I think I should take up a subscription to this. India pricing: https://www.wolframalpha.com/pro/pricing (10 USD per month).

Summary from the video with Lex Fridman

video - https://www.youtube.com/watch?v=PdE-waSx-d8

  1. the formal languages based on which we create the tower of consequences.
  2. computational reducibility is where we humans add value.
    1. it doesn't even matter whether it is done by a human or an AI; the agent doesn't matter at all. What matters is whether something can be done in a reducible manner. If it cannot be reduced, it has to be computed step by step to know the outcome (see the reducibility sketch after this list).
  3. AIs have access to better sensory data than we humans are able to have
  4. humans have a single thread of thinking
  5. the language is coherent and consistent, which makes it easy for an AI to understand
  6. encapsulate wisdom using natural language
    1. there is a structure in language that humans have shaped over time but have not understood well enough for themselves to formalise. AI was able to pick up that structure and help humans with it, but the irony is that AI and humans don't have a common language which can transport that wisdom from the AI brain to the human brain.
  7. humans knew about the grammatical rules of language, but language has a semantic grammar too, which they couldn't formalise.
  8. the formal structure in conversation is logic.
  9. Aristotle discovered logic by looking at a lot of sentences, the structure in them, and what that structure meant.
  10. Aristotle and syllogistic logic
  11. ChatGPT is still at the Aristotle level, where it depends on the structure of sentences. Computation is the equivalent of the Boole level of logic, where you can do nesting, deduction and so on, and build the tower of consequences from the basic premises you have formalised (see the deduction sketch after this list).
  12. ChatGPT is templates of natural language
  13. what ChatGPT showed is that there was more that could have been lifted out of language, but Aristotle stopped early.
  14. ChatGPT is helping us find more laws of semantic grammar.
  15. syntactic (basic rules defined to form sentences) -> semantic (rules for the sentences to be meaningful, based on human experience and imagination) -> thought (rules that help in creating thoughts that are consistent) -> problem solving (utility)
  16. words are defined by our social use of them
  17. language is a way of packaging thoughts so that we can communicate them to other minds.
  18. computational tower
  19. poetic, aimless nonsense
  20. humans = there is an abstraction that can be passed on from generation to generation
  21. computation is more rigorous: going beyond the human realm is also possible in a computational representation of the world.
  22. ChatGPT is shallow and wide, abstraction based on the human language available; computation is narrow and deep.
  23. humans discovered computation and invented the technology for computation.
  24. biology does a lot of computation.
    1. semiconductor-based (computer) computation is just one form.
  25. the tool that does things outside the human brain
    1. doing it with something that is completely built by humans. We do use silicon, but the intelligence part of it was all created by humans, and it sits outside the human brain and the intelligence of biology.
  26. how can something so complicated arise from something so simple?
  27. neural nets generalise in the same way that humans do
  28. neural nets can be good at anything that humans can do quickly. Just imagine the things you could do if you had access to all the text in the world and your brain had better memory and compute capabilities but the same level of intelligence/depth; neural nets will be good at those things.
  29. collective intelligence of the species
    1. Wolfram feels that the AI trend might push people to become generalists.
  30. gokul tweet: if we are not careful, very soon the temperature settings in the models will define how experimental humanity will be.
  31. picking the possibilities that we want
  32. universe does what it wants to do.
    1. we are the aberrations: we go against the universe and do what we want to do, and we call that agency.
  33. are human extinction by nature and human extinction by AI equally probable?
    1. we understand nature better and most of its changes are slow? there is a precedent.
    2. we are still understanding AI, and some of it is just not understandable.
  34. it works because it works
    1. is this true for both nature and AI?
    2. we find more general things about nature, like the laws of motion, so we know why something behaves the way it does.
  35. steerability matters a lot to humans?
    1. even if we don't understand how something works, as long as we are able to steer it we can find out more about its internal workings without feeling worried.
  36. rulial space
    1. the space that contains all the computational possibilities of all rules.
    2. the current world can be considered one manifestation within that rulial space.
      1. within that, how humans perceive the world and how, say, other animals perceive it can be different.
      2. so being able to translate our sensory experience and our models to other forms of intelligence or other creatures could completely fast-track the improvement of the models we use for various things.
  37. what you care about is more important
  38. implementation choices
    1. nerve-cell-based intelligence (human thinking)
    2. silicon-based computation
    3. molecular-based intelligence: biology or ecosystems
  39. what kinds of abstractions are natural in this kind of system?
  40. biological evolution is slower
  41. computation through symbolic reasoning, using the natural-language interface of Wolfram (see the symbolic computation sketch after this list)
  42. it will be the correct consequence of the rules you curate
    1. so we need to curate the correct set of rules/data
  43. I haven't thought this through properly
  44. giant chains of computational contracts
  45. we follow this procedure, we are transparent about it, and we might get it wrong, but at least we won't be corrupt about getting it wrong.
  46. collective understanding of languages
    1. Wolfram gives an example where he has a list of five facts that he wants to convey. If you just share them as is, most people may not understand them. But if you use an LLM to connect all five in a good way, add some analogies, etc., most people might find it interesting in addition to understanding it.
  47. natural language produced by the LLM is more like a transport layer.
  48. LLMs are a statistical continuation of language
  49. LLMs generate language; whether it is true or not is difficult to find out. When you reduce the temperature, the model sticks to what it has come across in the past, so it is plausible that the output is true. When the temperature is set high, it first gets imaginative, then it goes bonkers, where even the language it produces may not be language that we generally come across (see the temperature sketch after this list).
  50. democratisation of access to computation
  51. from "learn it before you use it" to "use it before you learn it"
  52. gokul: this also means you go broad quickly and see what is possible with limited investment, and then go deep where you want to. So you can quickly explore the surface of possibilities before you go deep to find the actual implementation and value creation that depends on towers of specificity.
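
A minimal sketch of the reducible vs irreducible distinction in item 2, assuming Python. The Rule 30 automaton is Wolfram's usual example of irreducibility, but the code and names here are my own illustration, not anything from the video.

```python
# Reducible: the sum 1..n has a closed-form shortcut, so you can jump
# straight to the answer without simulating every addition.
def sum_reducible(n):
    return n * (n + 1) // 2

# Irreducible (in practice): the Rule 30 cellular automaton. To know what
# row t looks like, you generally have to compute rows 1..t-1 first;
# no shortcut formula is known.
def rule30_rows(width, steps):
    row = [0] * width
    row[width // 2] = 1                      # start from a single black cell
    history = [row]
    for _ in range(steps):
        row = [row[(i - 1) % width] ^ (row[i] | row[(i + 1) % width])
               for i in range(width)]        # new cell = left XOR (centre OR right)
        history.append(row)
    return history

print(sum_reducible(10**9))                  # instant: the formula replaces a billion steps
for r in rule30_rows(31, 15):                # here you must run every step to see the pattern
    print("".join("#" if c else "." for c in r))
```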
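
A toy sketch of the Aristotle-vs-Boole point in item 11: once premises are formalised, consequences can be checked mechanically by nesting Boolean operations rather than by matching sentence templates. The propositions are invented for illustration.

```python
from itertools import product

# Premises formalised as Boolean functions of atomic propositions p, q, r.
premise_1 = lambda p, q, r: (not p) or q        # "if p then q"
premise_2 = lambda p, q, r: (not q) or r        # "if q then r"
candidate = lambda p, q, r: (not p) or r        # "if p then r" -- a consequence?

def entails(premises, conclusion):
    # Boole-level deduction: the conclusion must hold in every assignment
    # in which all the premises hold -- checked purely mechanically.
    return all(conclusion(*vals)
               for vals in product([False, True], repeat=3)
               if all(prem(*vals) for prem in premises))

print(entails([premise_1, premise_2], candidate))   # True: part of the tower of consequences
```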
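
For item 41, a small taste of symbolic (exact) computation. I am using SymPy only to keep all the sketches in Python; in the Wolfram Language the analogous built-ins are Solve, Integrate and Limit.

```python
import sympy as sp

x = sp.symbols("x")

# Exact symbolic answers rather than floating-point approximations.
print(sp.solve(x**2 - 2, x))                            # [-sqrt(2), sqrt(2)]
print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))  # sqrt(pi)
print(sp.limit(sp.sin(x) / x, x, 0))                    # 1
```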
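
For item 49, a sketch of what the temperature knob does to a next-token distribution. The vocabulary and scores below are invented; in a real LLM the logits come from the model itself.

```python
import numpy as np

tokens = ["the", "cat", "sat", "flew", "zzkrq"]   # toy vocabulary
logits = np.array([3.0, 2.5, 2.0, 0.5, -2.0])     # made-up next-token scores

def sample(logits, temperature, n=10, seed=0):
    # Softmax with temperature scaling, then draw n tokens.
    rng = np.random.default_rng(seed)
    probs = np.exp(logits / temperature)
    probs = probs / probs.sum()
    return list(rng.choice(tokens, size=n, p=probs))

print(sample(logits, 0.2))   # low temperature: sticks to the most "plausible" tokens
print(sample(logits, 1.0))   # medium temperature: more varied, gets imaginative
print(sample(logits, 5.0))   # high temperature: near-uniform, even "zzkrq" appears -- bonkers
```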
