Hey Cred Forums, can a computer ever make a real mistake? I don't mean user error, programmer fault, or alpha particle flipping from cosmic rays. Just true mistakes made by computers.

Also Cred Forums humor thread go >>

...

...

no

assuming no hardware faults, a computer will always, consistently, and perfectly do what it was told

every mistake is human

At what point does it become the computer's fault when dealing with AI? If a learning AI is building a simple program and it uses a float rather than an int where it needs integer division, is that the computer's fault, or the programmer's through a few levels of abstraction?

even an AI that apparently "makes a mistake" is only a result of the code humans wrote
computers don't even know the concept of "mistake"

you can get unpredictable results through hardware faults and random data obtained externally (RNG-gathering hardware), but calling those "mistakes" might not be accurate
i mean, humans still told the computer to produce an unpredictable result, so an unpredictable result is what it was meant to produce

explain

Q: Why didn't Intel call the Pentium the 586?
A: Because they added 486 and 100 on the first Pentium and got 585.999983605

There's no such thing as actual AI

>bash command
>he's about to press the left mouse button

This triggers me

The lack of --no-preserve-root doesn't trigger you?

It's the right mouse button actually, with his left hand.

our mistakes are also "programmer fault", so you're really asking whether computers can act irrationally? yes, if you program them to act irrationally (like we're programmed), they could.

i think today's computers are only about as complicated as a single human cell atm. in the future they'll be as complex as our whole bodies are now.

so when we make a "mistake" today, the decision comes from a body that's too complex to analyze.

I hope this changes your perspective.

Can a toaster make a mistake?

>Cred Forums humor
That's the real joke.

Cred Forums has you covered

welp

well, we're not very different from the rest of the universe. we're like "organic computers" which are programmed to create new possibilities (like the rest of the universe), just on another level, and, like everything else, by evolution.

you see, now we make inorganic computers ourselves, but with today's technology they are too simple compared to us. we can only compare them to viruses, or maybe single-cell organisms like bacteria, atm.

now if we can make nano computers at microscopic levels, and they act together in a system, then we can compare that to our consciousness, which is the same thing, or at least works very similarly to the organic version.

our "ego" clouds our minds, so we think of ourselves as "unique", but we're just like anything else (made from atoms). the universe itself doesn't give a fuck if we get wiped out. so we create "values" ourselves and put human beings on top of everything, which is a very, very egoistic thing to do.

we're just another stepping stone in evolution, that's all.

...

...

complexity of algorithms

Error is a made-up concept. Even humans don't make errors; it's simply how all the physics of the body worked out.

Assuming that you're not including programmer mistakes on hardware (CPU bugs), yes, CPU degradation can cause CPUs to misfire instructions.

But if you're asking whether a perfectly implemented instruction set can ever misfire on a perfectly working chip, no. That's why checking the execution logic itself for errors isn't a thing on ordinary modern processors; the error correction they do have is there to catch hardware faults, not the logic.

That's dumb, O(n/2) is the same as O(n)

>O(n/2)
wtf

Wow, we have a lot in common! We should be friends!

I don't have friends...

It's not

He needed another spot to fit.

You choose an algorithm in part based on how fast its need for resources grows relative to the size of the data you feed it.

>humor
Nice one, kek

>O(n/2) and O(n) are somehow different

I don't know how you arrived at that conclusion, but I'm down

It is; the constant factor is ignored in big O, and 1/2 is a constant.

>not using O(1/n)

>the constant is ignored in big o

I'm from MIT, it is not ignored. Maybe at some backwards colleges, but not here

yeah, they make thousands of mistakes every second, but they try over and over until it just works.

Yes it is, because the functions are compared when the input is very large (as n approaches infinity).

Wow, MIT has really gone down the shitter. I hear they don't even teach Scheme to you pajeets anymore.

>O(n/2)

Did you mean to type "ITT"?

>I'm from MIT, it is not ignored.
The constant is ignored because of the definition of big O notation
en.wikipedia.org/wiki/Big_O_notation?oldformat=true#Example.
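To spell out why, here's the definition from that article applied to this case (nothing beyond what the article already says):

f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ \text{s.t.}\ |f(n)| \le c\,|g(n)| \ \text{for all } n \ge n_0

With f(n) = n/2 and g(n) = n, the constant c = 1/2 works for every n >= 1, so n/2 = O(n); conversely n <= 2*(n/2), so O(n/2) and O(n) are the same class.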

I can't believe somebody did this IRL

Just curious as a non-programmer with a passing interest.

I understand the idea is that if the random number modulo 6 equals zero, the delete command will run, and otherwise it displays the printout, but where is the if/else statement here? Is that just how it's done in bash?

Or is this code just bullshit? Seems like it's missing something.

[ x ] && y || z
if x then do y, else do z

Ah thanks. It's a bash syntax thing then.

It's more like making use of short-circuiting to form an if-else statement. If the left side of && evaluates to false, then there's no point in evaluating the right side, because the result of "and" is already determined to be false.

The usual way of writing if-else in bash looks pretty much as you'd expect:
if [ x ]; then
y
else
z
fi

It's common to a lot of C-like languages, often with different punctuation. If you want to be explicit, you can do
if [ x ]
then
y
else
z
fi
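
A runnable way to see the two forms side by side (a small sketch with a made-up variable x, not anything taken from the screenshot):

#!/usr/bin/env bash
x=0
# short-circuit form: the command after && runs if the test succeeds, otherwise the one after ||
[ "$x" -eq 0 ] && echo "x is zero" || echo "x is not zero"
# explicit form, same result here
if [ "$x" -eq 0 ]; then
  echo "x is zero"
else
  echo "x is not zero"
fi

One catch with the short form: if the command after && itself fails, the command after || runs as well, which a real if-else would never do.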

Thanks man.

Last question: is the left side in the example checking that the random number is divisible by 6 without remainder (the $RANDOM % 6 == 0 part)?

Yes, but I'm not sure if the syntax is correct. I'm used to writing $(($random%6)), not with [ ].
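
For reference, the joke one-liner under discussion usually circulates as something like this (reconstructed from memory, so treat the exact wording as an assumption, and obviously don't run it):

[ $((RANDOM % 6)) -eq 0 ] && rm -rf / || echo "*click*"

Inside [ ], a numeric comparison is normally written with -eq rather than ==, $(( )) is ordinary arithmetic expansion, and RANDOM is bash's built-in variable that yields a number from 0 to 32767. The missing --no-preserve-root pointed out earlier matters because GNU rm refuses to operate on / without it.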

Cool. Having the number = 0 if divisible by 6 was a retarded line of thought on my part hehe. It would in essence be roughly a 1/32768 chance. A lot less sporting than just having $RANDOM % 6.

Thanks for the explanation man.

Since computers are made up of hardware, yes. Computers are always liable to error.

I guess you may classify it as a programming error, but there are plenty of cases of having to work around limitations in decimal arithmetic because of the way binary works. Is that an error?
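
As a concrete example of the kind of limitation meant here (a small sketch; printf just shows how the nearest binary double to 0.1 actually prints):

printf '%.20f\n' 0.1
# prints something like 0.10000000000000000555
echo $((7 / 2))
# prints 3, since shell arithmetic is integer-only and truncates

0.1 has no exact binary representation, so you get the nearest double, and integer division simply truncates. Neither is the computer making a mistake; it's doing exactly what the representation and the rules say.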

A computer will do exactly what it's programmed to do. If the program is flawed, it will make a mistake. Computer hardware can also be flawed, which causes it to execute the program incorrectly.

It's impossible to make a perfect computer if that's what your question is about.

>Hey Cred Forums, can a computer ever make a real mistake? I don't mean user error, programmer fault, or alpha particle flipping from cosmic rays. Just true mistakes made by computers.

en.wikipedia.org/wiki/Soft_error#Causes_of_soft_errors

he specifically excluded these kinds of external disturbances

reminds me of a Charles Babbage quote:
>On two occasions I have been asked, — "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" In one case a member of the Upper, and in the other a member of the Lower, House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

calling a programmer or hardware designer error a mistake on the computer's part doesn't make sense; the computer will come to the same result every time you run a flawed program or run on flawed hardware, so it's still doing exactly what you told it to do

if you misspell a word on a keyboard, did the keyboard make a mistake?

>Hey Cred Forums, can a computer ever make a real mistake?
Conceptually with no outside interference? No.
Errors are caused by outside influence.

Can someone explain that to me? The formulas, not the deep level shit

As you go down the chart, the formulas have slower and slower growth rates. If you need to run something a bunch of times, you'd much rather have linear complexity than exponential.
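
For reference, the ladder on the usual version of that chart (assuming it's the standard one, since the image isn't visible here) runs O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), O(n!) from cheapest to most expensive. For n around a million, log n is about 20, n^2 is already a trillion, and 2^n isn't worth writing down.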

Cool, i thought there was more to that, like it was a formula for some equation. Thanks user