Hacker News
sgillen on July 10, 2020 | on: Why general artificial intelligence will not be re...
As long as all those reals are computable ;)
https://en.m.wikipedia.org/wiki/Computable_number
ggggtez on July 10, 2020
It's fully possible for a computer to do symbolic math without reducing the symbols to a decimal.
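A quick illustration of this point (my sketch, not the commenter's — it uses only the standard-library `fractions` module as a stand-in for a full symbolic system like a CAS): exact arithmetic never reduces the values to decimal approximations, so no rounding error ever appears.

```python
from fractions import Fraction

# Exact rational arithmetic: values are kept as numerator/denominator
# pairs and never reduced to a decimal approximation.
third = Fraction(1, 3)
assert third + third + third == 1      # exactly 1, no rounding
assert Fraction(1, 10) * 10 == 1       # 1/10 is exact here, unlike the float 0.1

# Contrast with floating point, where the same sum accumulates error:
assert 0.1 + 0.1 + 0.1 != 0.3
```

A full computer algebra system extends the same idea to irrationals, carrying symbols like sqrt(2) or pi unevaluated and simplifying them by rewrite rules rather than by computing digits.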
sgillen on July 11, 2020
Of course! My comment was meant to be tongue in cheek, but saying that some numbers are uncomputable is not incompatible with doing symbolic math on a computer. Note that pi, for example, is computable despite having an infinite decimal expansion: there is an algorithm that, given any n, outputs its first n digits.
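To make "pi is computable" concrete, here is a small sketch (mine, not from the thread) that prints pi to any requested number of digits using only integer arithmetic, via Machin's formula pi/4 = 4·arctan(1/5) − arctan(1/239):

```python
def arctan_inv(x, one):
    """arctan(1/x), scaled by the integer `one`, via the alternating
    Taylor series 1/x - 1/(3x^3) + 1/(5x^5) - ..."""
    power = one // x          # one / x^(2k+1), truncated
    total = power
    xsq = x * x
    k, sign = 1, -1
    while power:
        power //= xsq
        total += sign * (power // (2 * k + 1))
        sign = -sign
        k += 1
    return total

def pi_digits(n):
    """Return pi truncated to n digits after the leading 3, as an integer."""
    one = 10 ** (n + 10)      # 10 guard digits absorb truncation error
    pi = 4 * (4 * arctan_inv(5, one) - arctan_inv(239, one))
    return pi // 10 ** 10

print(pi_digits(15))  # 3141592653589793
```

The key property — what "computable" means here — is that `pi_digits(n)` halts for every n and its output agrees with pi to the requested precision; an uncomputable real admits no such algorithm.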