Let's discuss Mathematics

@Mise: Another thing that greatly improves the chances is that the successes are correlated. In the sneaky-guard configuration (2, 3, 4, ..., 1), not a single prisoner has the right box. However, if a single prisoner is successful (meaning he is in a cycle of length <= N/2), then all prisoners in his cycle will be successful.

Link: http://www.sciencenews.org/view/generic/id/7649/title/Puzzling_Names_in_Boxes
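The cycle-following strategy is easy to check empirically. A minimal simulation sketch (function names and trial counts are my own): each of the 100 prisoners starts at the box bearing his own number and follows the slips, and the whole group succeeds exactly when the random permutation has no cycle longer than 50.

```python
import random

def all_prisoners_succeed(n=100, limit=50):
    """Simulate one run of the cycle-following strategy.

    boxes[i] is the slip hidden in box i. Each prisoner starts at the
    box bearing his own number and follows the slips; he succeeds iff
    his cycle has length <= limit, and one failure dooms the group.
    """
    boxes = list(range(n))
    random.shuffle(boxes)
    for prisoner in range(n):
        box = prisoner
        for _ in range(limit):
            if boxes[box] == prisoner:
                break  # found his own number within the limit
            box = boxes[box]
        else:
            return False  # cycle longer than limit
    return True

trials = 20_000
wins = sum(all_prisoners_succeed() for _ in range(trials))
print(wins / trials)  # hovers around 1 - ln 2, roughly 0.31
```

The success rate sits near 31%, wildly better than the (1/2)^100 that independent guessing would give, which is exactly the correlation effect described above.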
 
This would suck as a problem in game theory. Everyone assumes everyone knows the best strategy already ;) That's why the global economy is monkeyfrapped.
 
Yeah, but the probability of that occurring is 0, so ;)

EDIT: Unless one of the probabilities is 0 also.

EDIT2: The meaningful question that can be asked, of course, is "what is the probability that this process will not terminate after a certain number of iterations?", and as long as the probabilities aren't 0 or 1, this tends to 0 as the number of iterations tends to infinity.
Sometimes, for real-time software applications, that's not good enough. You need guarantees that an operation will terminate within a specific number of cycles. You could force a winner after a finite number of iterations, but then you are sacrificing randomness.

Now in this application, it would make more sense to make a better coin. But that doesn't work if you vary the desired outcome distribution.
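The "better coin" here is presumably the classic two-flip trick (von Neumann's, I believe): flip the biased coin twice, output heads on HT, tails on TH, and start over on HH or TT. Since P(HT) = P(TH) = p(1-p), the output is fair regardless of the bias, but the number of rounds is unbounded, which is exactly the termination issue under discussion. A sketch:

```python
import random

def biased_flip(p):
    """One flip of a coin that lands heads with probability p."""
    return random.random() < p

def fair_flip(p):
    """von Neumann trick: extract a fair bit from a biased coin.

    Flip pairs until the two flips differ; HT -> True, TH -> False.
    May loop arbitrarily long, though it runs forever with probability 0.
    """
    while True:
        a, b = biased_flip(p), biased_flip(p)
        if a != b:
            return a

heads = sum(fair_flip(0.9) for _ in range(100_000))
print(heads / 100_000)  # close to 0.5 despite the 0.9 bias
```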
 
The prisoner's box problem is quite cool. And dangit, I knew the answer to this one off the top of my head too, and could have just jumped in and answered it earlier if I'd been around. It falls into the "heard the problem before and you can't trick me just by rewording it" category, though.

In related discussion: I nearly drove myself crazy over a couple of days this past winter working on a random problem (I don't know if it corresponds to anything known/real out there) that I'm also not sure has any solutions, but where I was looking at similar ideas. I don't want to waste anyone's time here; the summary is that each person was given a unique number (originally I was thinking playing cards, but natural numbers sounded better), and the goal was for everyone in the group to learn everyone's number within a certain number of rounds of communication under certain rules, so what algorithm should they follow? But this permutation problem certainly sprang to mind very quickly, because I'd heard it before. Oh well.

edit-

"I saw what you did there!", he ejaculates.

already pointed out in part but please tell me this was intentional! :rotfl:
 
Sometimes, for real-time software applications, that's not good enough. You need guarantees that an operation will terminate within a specific number of cycles. You could force a winner after a finite number of iterations, but then you are sacrificing randomness.

Now in this application, it would make more sense to make a better coin. But that doesn't work if you vary the desired outcome distribution.

Tut tut, you said it may never terminate ;) As long as 0 < p < 1, it will, assuming we have infinite time available.

I brought up the "terminate within a set number of iterations" thing ;) Don't start backtracking now! This is why maths > computer science!
 
Tut tut, you said it may never terminate ;) As long as 0 < p < 1, it will, assuming we have infinite time available.

I brought up the "terminate within a set number of iterations" thing ;) Don't start backtracking now! This is why maths > computer science!
You brought up diminishing probability that the number of rounds is n, but that doesn't help in the example I outlined. There is in that case a non-zero chance that the device will not function properly because the number of rounds is greater than the max number of rounds tolerated by the device.

Math is fun, but it's useless unless applied.
 
I'm sure PS and Dutchfire are ready to sacrifice their lives by flipping a coin for the rest of their days, just to use a mathematically sound method.

The expected number of flips is 1/[p(1-p)], if I remember correctly (it's very easy to check anyway). So even with a coin that landed tails only 1/100 of the time, E(X) would be 100*100/99 ~ 101.

The probability of not reaching a conclusion after 1000 flips would be (1/10000 + 9801/10000)^500 ~ 0.0000454 (if I can read the calculator right).
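Both figures check out. Each pair of flips resolves with probability 2p(1-p), so the number of pairs is geometric with mean 1/(2p(1-p)), the number of flips has mean 1/(p(1-p)), and 500 unresolved pairs have probability (p^2 + (1-p)^2)^500. A quick check with p = 1/100:

```python
p = 1 / 100

# Expected flips: each pair resolves w.p. 2p(1-p); two flips per pair.
expected_flips = 2 * (1 / (2 * p * (1 - p)))
print(expected_flips)  # 10000/99, about 101.01

# P(no decision after 1000 flips) = P(all 500 pairs agree).
same_pair = p**2 + (1 - p) ** 2  # 1/10000 + 9801/10000
print(same_pair**500)  # about 4.55e-05
```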

already pointed out in part but please tell me this was intentional!

To the point that I used an often-repeated sentence. I don't really know where it originates from, and thus whether there's something more to it; if there is, it's not intentional. ;)
 
You brought up diminishing probability that the number of rounds is n, but that doesn't help in the example I outlined. There is in that case a non-zero chance that the device will not function properly because the number of rounds is greater than the max number of rounds tolerated by the device.

Math is fun, but it's useless unless applied.

You'd need to add a cutoff and pick an arbitrary winner (alternate p1/p2 say, or whoever out of p1/p2 had less wins overall).

I do know about real-time programming; I am a games programmer, after all ;) You obviously wouldn't implement an algorithm like that when time was critical anyway, at least not without a failsafe mechanism.

We could say penalty shootouts in football (soccer) are useless because there is a non-zero chance the process will not terminate before the heat death of the universe - but we still use them.
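That failsafe is easy to bolt on. A sketch of the bounded version with the arbitrary-winner fallback (the round limit and tie-break rule here are my own illustrative choices): run the two-flip trick for at most a fixed number of rounds, and if no round resolves, fall back to a deterministic winner.

```python
import random

def fair_flip_bounded(p, max_pairs=500, fallback=True):
    """Fair bit from a biased coin, guaranteed to finish.

    Runs von Neumann pairs for at most max_pairs rounds; if none
    resolves (probability (p^2 + (1-p)^2)^max_pairs, tiny but
    nonzero), returns the deterministic fallback. The result is
    therefore very slightly biased toward the fallback value,
    which is the price of a hard termination guarantee.
    """
    for _ in range(max_pairs):
        a = random.random() < p
        b = random.random() < p
        if a != b:
            return a
    return fallback

print(fair_flip_bounded(0.5))
```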
 
I decided mathematics is a waste of time... I'm going to put my energies into a branch of study with more use.

Meet IdiotsOpposite the Liberal Arts major!
 
:confused: You decided to drop maths the same day the Riemann hypothesis was announced to be proven? :(

Here's one of my favourite limericks that touches the subject:

There was a young fellow of Trinity
Who found the square root of infinity
The number of digits
Gave him the fidgets
So he dropped math and took up divinity.
 
Isn't that announced as proven every April 1st? ;)
 
I decided mathematics is a waste of time... I'm going to put my energies into a branch of study with more use.

Meet IdiotsOpposite the Liberal Arts major!

Every day I wonder what it would be like to be, say, a poetry major. Sitting outside on a bench under some tree or in the grass, chilling with my latte, maybe typing something about zeitgeists on a small mac notebook...sigh

Atticus - wake me up when they prove the Collatz conjecture.
 
So, I was fiddling around on my graphing calculator, exploring 0^0 and whether it really is 1. It turns out the limit of x^x as x approaches 0 is 1. If x is negative, there is an imaginary component, which approaches 0 as x approaches 0. Nothing too unexpected there. But that's not the whole story.

As x grows closer and closer to zero, the imaginary component of x^x grows closer to x * pi. Try it for yourself on any calculator equipped to handle imaginary numbers. For instance, if x = -10^-11, x^x = (1.0000000002532843-3.141592654385509E-11i)
If x = -10^-20, x^x = (1-3.141592653589793E-20i)

Try it for yourself! Can anyone with more than my pitiful amount of calculus explain to me why this is so? Is it somehow linked to Euler's Identity?
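The experiment is easy to reproduce with Python's cmath, assuming the principal branch of the complex logarithm (which is what calculators use): x^x = e^(x log x), and for small negative x the imaginary part comes out as pi * x.

```python
import cmath

def x_to_the_x(x):
    """x**x via the principal branch: exp(x * log x)."""
    return cmath.exp(x * cmath.log(x))

for x in (-1e-11, -1e-20):
    z = x_to_the_x(x)
    # The ratio of the imaginary part to pi*x tends to 1.
    print(x, z, z.imag / (cmath.pi * x))
```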
 
Just wanted to say that the whole "liberal arts" thing was my lame attempt at an April Fool.
 
Yeah, admiral-bell, that's pretty straightforward.

You can represent a negative x as

|x|e^(j*pi)

So then raising that to the x power (x being the negative number itself) you get

|x|^x as the magnitude, which also approaches one in the limit. Since x is negative and small, |x|^x = e^(x*ln|x|) has a small positive exponent, so it approaches one from above; that's why you're getting magnitudes slightly greater than one.

And then you can multiply the exponents easily, so the phase factor is e^(j*pi*x)

And as x is really small that's essentially cos(pi*x) + j*sin(pi*x) ~ 1 + j*pi*x for a small-angle sine, and since x is negative that imaginary part pi*x is exactly what you observed.
 
That's why 0^0 is undefined.

It has to tend to 1 along all paths to be able to effectively define it as 1 (as sin x/x does at 0, for example).

Straight-line paths alone aren't sufficient either.
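A concrete non-straight-line path makes the point (my own illustrative example): along y = a/ln(x), the two-variable quantity x^y is identically e^a, so by choosing a you can make x^y approach any positive value as x tends to 0 from above, even though both x and y tend to 0.

```python
import math

def along_path(x, a):
    """x**y along the path y = a/ln(x); identically e**a for 0 < x < 1."""
    y = a / math.log(x)
    return x ** y

for x in (1e-3, 1e-6, 1e-12):
    print(along_path(x, 2.0))  # stays at e^2 ~ 7.389 all the way down
```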
 
Given the pi discussion in the science and technology thread, I'll pose this problem:

Show that there exist more (in the standard set-theoretic sense of "cardinality") points along the edge of a circle than in the sequence (4, 11/3, 56/15, ...) which approximates pi.
 
Well, the sequence has the same cardinality as the natural numbers.

Whereas the set of points on a circle is isomorphic to SO(2), the special orthogonal group, which is parametrized by [0, 2pi), and that interval has the same cardinality as the continuum.
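Spelled out in the standard set-theoretic terms (just the usual Cantor diagonalization, sketched):

```latex
\begin{itemize}
  \item The sequence $(a_n) = (4, \tfrac{11}{3}, \tfrac{56}{15}, \dots)$
        is indexed by $\mathbb{N}$, so
        $|\{a_n : n \in \mathbb{N}\}| \le \aleph_0$.
  \item The circle bijects with $[0, 2\pi)$ via
        $\theta \mapsto (\cos\theta, \sin\theta)$, and $[0, 2\pi)$
        bijects with $[0, 1)$ by rescaling.
  \item Cantor's diagonal argument: given any list
        $x_1, x_2, \dots \in [0, 1)$, the number
        $y = 0.d_1 d_2 d_3 \ldots$ with each $d_k \in \{1, \dots, 8\}$
        chosen different from the $k$-th digit of $x_k$ lies in
        $[0, 1)$ but appears nowhere on the list. Hence
        $|[0, 1)| > \aleph_0$, so the circle has strictly more
        points than the sequence.
\end{itemize}
```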
 