Let's discuss Mathematics

another banal question

(attached image: KBxrX1w.png)


I'm supposed to transform to spherical coordinates. Is it simply a matter of changing x to r*cos(fi)*sin(theta) etc., changing the limits to 0 to pi for theta and 0 to 2pi for fi, and multiplying by the Jacobian determinant r^2*sin(theta)? Or is there something I've gotten wrong or am forgetting?

also, I don't expect anyone to do this for me, but in case someone wants to try it, the problem gives this identity as a hint
(attached image: an6nj9w.png)

though I expect that to be fairly easy to do once I manage to change to spherical coordinates

(which when I examine it very closely now using cos^2+sin^2=1 makes it very easy if my assumptions are right)
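For what it's worth, the Jacobian is easy to sanity-check symbolically. A minimal sympy sketch (my own, using the convention where theta is the polar angle, 0 to pi, and fi the azimuthal angle, 0 to 2pi):

```python
import sympy as sp

r, theta, fi = sp.symbols('r theta fi', positive=True)

# Spherical coordinates with theta as the polar angle:
x = r * sp.sin(theta) * sp.cos(fi)
y = r * sp.sin(theta) * sp.sin(fi)
z = r * sp.cos(theta)

# Jacobian determinant of (x, y, z) with respect to (r, theta, fi)
J = sp.Matrix([x, y, z]).jacobian([r, theta, fi]).det()
print(sp.simplify(J))  # r**2*sin(theta)
```

So yes, with that convention the extra factor is exactly r^2*sin(theta).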
 
does anyone have any neat tricks for integrating something like a*x^2*sin^2(bx) (from minus to plus infinity)?
like it does seem symmetric, so I can multiply the integral from 0 to infinity by 2 and get the same result, but that doesn't help terribly much
 
I do not think it counts as a neat trick, but these days when I have to do something like this I use Wolfram Alpha. Its answer, using a series expansion, does not look very elegant.
 
yea
this is from a university class with weekly assignments and just based on that I doubt that's the way I'm meant to go

also for the record if it helps any this is trying to find the expectation value of x^2 in an infinite square well potential in quantum physics

I was able to say that the expectation value for x was zero because a*x*sin^2(bx) is an antisymmetric function, but with x^2 this gets worse
 
managed to solve it with sin^2(x) = (1 - cos(2x)) / 2

actually not quite because I got an answer that's obviously wrong lol but this is pretty clearly the technique you need to use
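That identity is indeed the standard move, and sympy can double-check the final result. A sketch in my own notation, assuming the well runs from 0 to L with psi_n(x) = sqrt(2/L)*sin(n*pi*x/L) (for a well centered on 0 the limits change but the trick is the same):

```python
import sympy as sp

x, L = sp.symbols('x L', positive=True)
n = sp.symbols('n', integer=True, positive=True)

# Infinite square well on (0, L): |psi_n(x)|^2 = (2/L) * sin(n*pi*x/L)^2
density = (2 / L) * sp.sin(n * sp.pi * x / L)**2

# <x^2> over the well; note the finite limits, not +/- infinity
expect_x2 = sp.simplify(sp.integrate(x**2 * density, (x, 0, L)))

# Doing it by hand with sin^2 = (1 - cos(2kx))/2 gives the closed form:
closed_form = L**2 / 3 - L**2 / (2 * n**2 * sp.pi**2)
print(sp.simplify(expect_x2 - closed_form) == 0)  # True
```

The cos(2kx) piece integrates by parts twice, and sin(2*n*pi) = 0 kills most of the boundary terms.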
 
If the limits are + and - infinity, it seems that your answer must be infinity - it is going to blow up as x gets big. However, your limits shouldn't be infinity if you are looking at a square-well potential, but instead 0 and L or +/- L/2.

As a student, I'd have looked it up in a table. Wolfram Alpha suggests converting sin^2 bx to 0.5 (1-cos(2bx)) to do the integral if you want to do it yourself, but it does give the answer. (Note: for me, it chokes on what Samson typed in, but it works fine when I type it in as "integrate ax^2sin^2(bx) dx" instead. I don't know why it doesn't like the extra parentheses.)

Edit: well, my help was too slow. Sorry.
 
alright check this set of equations

a^2+bc=1

d^2+bc=1

b(a+d)=0

c(a+d)=0

b=c* (the problem starts by assuming all of a, b, c and d are complex, but through various unrelated stuff it turns out a and d have to be real, and I'm fairly certain b and c are also real here, so really b=c, but I technically don't know that yet)

so consider a+d=0 (I've already figured out the b=c=0 case)

obviously a=-d and a^2=d^2

now twice I've felt certain I've seen something that means a or d has to be 1 in this case, only to have it slip away seconds later

but like how do I solve this, I guess is my question
 
I assume the set is
a^2+bc=1

d^2+bc=1

b(a+d)=0

c(a+d)=0

If you're looking at a+d=0, then b and c can be whatever you want. There are many quadruplets of solutions, like a=sqrt(2), d=-sqrt(2), b=i and c=i, or (if you prefer real solutions) a=sqrt(2), d=-sqrt(2), b=1 and c=-1. It's probably possible to make a 3D visualization of which (a, b, c) triplets work (since d is determined by a, it isn't needed).

If b=c then bc=b² and the first two equations look much better.
 
I also have that b and c are each other's complex conjugates, so identical if they're real, so those solutions you suggested can't fit.

The problem is I can't just assume that b and c are real, and I still can't figure out how to decide their values when assuming a+d=0
 
a=0, b=i, c=-i, d=0 works right ? Which means b and c aren't necessarily real ?

Edit: if you limit yourself to b and c being purely imaginary, then for 0<k<1, b=k*i, c=-k*i and a=sqrt(1-k^2) works in general
 
especially since assuming a+d=0, I'm essentially left with

a^2+bc=1

d^2+bc=1

and since a^2=d^2 that's essentially just one equation with two unknowns

I mean I still have b=c*, but like

I could try like b=b(real)+ib(imag) I guess
 
asked for help, so in case people are curious: the answer is a and d in cosine form and b and c in sine form (either real or imaginary for the latter)
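A quick numerical check of that answer. The parametrization below is my guess at what the cosine/sine form looks like: a = cos(t), d = -cos(t), b = exp(i*phi)*sin(t), c = conj(b), so b and c are real for phi = 0 and purely imaginary for phi = pi/2:

```python
import cmath
import math

def check(t, phi, tol=1e-12):
    # Hypothetical parametrization of the a+d=0 solution family,
    # with b and c complex conjugates of each other.
    a = math.cos(t)
    d = -a
    b = cmath.exp(1j * phi) * math.sin(t)
    c = b.conjugate()
    # All four equations should be satisfied: b*c = |b|^2 = sin(t)^2,
    # so a^2 + b*c = cos(t)^2 + sin(t)^2 = 1, and a+d = 0 kills the rest.
    eqs = [a*a + b*c - 1, d*d + b*c - 1, b*(a + d), c*(a + d)]
    return all(abs(e) < tol for e in eqs)

print(all(check(t / 10, p / 10) for t in range(63) for p in range(63)))  # True
```

The reason it works is the cos^2+sin^2=1 identity again: b*c = b * conj(b) = |b|^2 regardless of the phase phi.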
 
does the like infinite sum or whatever of 1/n! converge to something?

that is like 1+1/2!+1/3!+1/4!+...

I know it's kinda important to like figure out how and why these kinds of things are such and such but right now I kinda just need to know if it does and what it is

edit: 2 seems like a reasonable guess, but I'm not sure

edit2: or maybe it's more 1.75, but that's just from adding up until n=10
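It does converge: since the sum of 1/n! from n=0 is e, the sum starting at n=1 is e - 1, about 1.71828. The partial sums settle very fast, which is easy to see numerically:

```python
import math

# Partial sums of 1/1! + 1/2! + 1/3! + ... ; the full series is e - 1
s = 0.0
for n in range(1, 15):
    s += 1 / math.factorial(n)

# After 14 terms the remainder is under 1e-12
print(abs(s - (math.e - 1)) < 1e-10)  # True
```

The tail after N terms is bounded by 1/N! times a geometric series, which is why a handful of terms already gives several correct digits.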
 
(attached image: pWbj9Ph.png)


Given that one (2.4), I'm supposed to get (2.5), but I can't no matter what I try.
Stirling's approximation here is N! = N^N * e^(-N) * sqrt(2*pi*N)

thing is, the N^N part for N + or - N seems uncrackable, and the e-factors disappear, even though they're supposed to be there in the final answer
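Not a solution to the derivation, but a sanity check of that form of Stirling's approximation (my own sketch; the relative error shrinks roughly like 1/(12*N), which is why it's safe to use for large N):

```python
import math

def stirling(N):
    # Stirling's approximation: N! ~ N**N * exp(-N) * sqrt(2*pi*N)
    return N**N * math.exp(-N) * math.sqrt(2 * math.pi * N)

for N in (5, 10, 50, 100):
    exact = math.factorial(N)
    rel_err = abs(exact - stirling(N)) / exact
    print(N, rel_err)  # relative error decays roughly like 1/(12*N)
```

A common way to tame the awkward powers in problems like this is to take the log of the whole expression first, so N^N becomes N*ln(N) and the factors turn into sums before any expansion.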
 