Thank you for both of those links. Is there any possible difference between textContent and createTextNode?
Some other advice:
- Unless you need to support IE at this point, I'd get into the practice of using "let" and "const" instead of "var". "var" behaves in weird ways (look up "variable hoisting" for more), "let" works like you probably expect, and "const" of course can't be reassigned once set.
- Your loop is from 0 to 5 but what you really want is from 0 to bTypes.length.
- Even better, you can do bTypes.forEach(function(bType){ ... }). (Quick sketch of all this below.)
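Roughly what I mean, as a quick sketch (bTypes is your array; the values here are made up for illustration):
// "var" is hoisted to the top of the function, so this logs undefined instead of throwing:
console.log(hoisted); // undefined
var hoisted = "hi";
// "let"/"const" are block-scoped and can't be read before their declaration:
// console.log(notYet); // would throw a ReferenceError if uncommented
let notYet = "hi";
const bTypes = ["circle", "square", "triangle"]; // made-up values, just for the example
// loop bound tied to the array instead of a hard-coded 5:
for (let i = 0; i < bTypes.length; i++) {
  console.log(bTypes[i]);
}
// same loop with forEach, no index bookkeeping:
bTypes.forEach(function (bType) {
  console.log(bType);
});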
You can technically have multiple text nodes on an element (some artifact of the way the DOM was designed). textContent will get you the content of all of them combined, or, if you set it, will wipe all of them and set just one: https://jsfiddle.net/v0vxgd39/
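A quick sketch of that difference, same idea as the fiddle (assumes a <div id="out"></div> exists on the page; the id is made up):
const el = document.getElementById("out");        // assumes <div id="out"></div> exists
el.appendChild(document.createTextNode("Hello "));
el.appendChild(document.createTextNode("world")); // two separate text nodes now
console.log(el.childNodes.length);                // 2
console.log(el.textContent);                      // "Hello world" (all of them combined)
el.textContent = "replaced";                      // wipes both and leaves a single text node
console.log(el.childNodes.length);                // 1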
#javascript truly is the world's most beautiful language
null >= 0
→ true
null > 0
→ false
null == 0
→ false
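Same comparisons with comments, for anyone wondering; as far as I understand the spec, > and >= coerce null to a number, while == has its own special-case rules:
console.log(null >= 0);         // true:  >= converts null with Number(null), i.e. 0, and 0 >= 0 holds
console.log(null > 0);          // false: same conversion, but 0 > 0 is false
console.log(null == 0);         // false: == never converts null to a number
console.log(Number(null));      // 0
console.log(null == undefined); // true: the only thing null is loosely equal to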
Ah thanks.
I'll ask my professor about using let/const instead of var. Dunno if they use some auto grading thing or what.
Also is there any real problem with looping to 5 if I know I'll be looping that many times? Or is it just bad to get into those sorts of habits?
https://twitter.com/tom7/status/752249842374733825
How does this happen?
How, like "for what reason"?https://twitter.com/tom7/status/752249842374733825
How does this happen?
I'm interested in input myself. You can also post this on r/cscareerquestions, they're pretty good.
You could just take the A job and jump to B when you land it. That might or might not be possible depending on your contract. It might look bad to future employers, though most likely it won't.
You could accept the A job, try to push back the start date as far as you can, and then call it quits as soon as you land job B. A might not be willing to push back and you might lose the job anyway. Or you could call them and ask if you could think about it for two weeks. A might see you hesitating and want to bump up your benefits. Or they might just drop you.
You could take the A job and try to make the best of the situation. Maybe they're incredibly bored with their product too.
Or you could just not take the A job and hope for the best with B. You really don't want to do that, trust me. Unless you have savings lined up, in which case just go for it.
B just sounds incredibly risky, but then again there's nothing better than working at the place of your dreams, which I always think is worth going for. There's always another, crappier job available for you if you don't manage to land that one this time.
Wasn't aware of this subreddit. Thanks, I'm gonna post it there.
Are you in a similar situation?
See, this is the subreddit that I hate.
Every fucking thread: "Oh you want an internship as a student? I hope you have a very impressive resume with a bunch of impressive personal projects"
Interns see themselves as worthless programmers, but companies actually see interns as incredibly cheap labor. If you show them you can write a program, they will hire you.
I'm not, but I will be starting my fifth and last year in school this fall and questions like these are interesting to me.
It depends. If you want to work at a Microsoft, Amazon or Google, you'd better be one of the better programmers they're going to see for that position. Luckily for those of us who don't aspire quite so high, there are tens of thousands of other companies that need good programmers.
edit: and you never answered my other post. What type of SQL did you need help with?
Well, there are people who started later than that? ^_^
More seriously, I'd be curious to know whether more or fewer people start coding at 6-8 y.o. now than 30 years ago. When you had a computer in the 80s, programming was quite natural (plenty of games were "distributed" on paper through books and magazines, and programming was directly available, since computers were little more than code editors, without a GUI... in fact, 80% of the manual of my first computer was a Basic 1.0 reference manual. Also, there weren't many interesting things to do with a computer).
Also, coding books for young children were common. After 1990, I've barely seen any coding books for ~7 y.o.
They're more common again now (mainly Python and, curiously I'd say, JavaScript). Also, things like Arduino, Mindstorms and co. are helping coding enter schools... and governments are interested in coding now.
So, out of curiosity, how old were you when you started? Basic and Logo at 7 for me, ASM at 9, Pascal at 12 IIRC.
(I don't think that matters, although I think it's easier when you're young, like foreign languages. I'm just curious: barely any of my students started young, while almost all of my friends interested in computers started before 10, and I've been wondering whether it's a )
My linear algebra teacher is teaching us stuff that he says has applications in things like signal processing, signal noise compression, quantum mechanics, etc. Frankly, a lot of this stuff is really confusing to me. I can post my test review material in a bit to show you what I mean. To what extent should I be familiar with linear algebra as a programmer if I'm not working in a relatively math-heavy programming job?
Dunno about conventional programming, but my old elementary school started introducing Code Monkey-like programming stuff into computers class as a sort of intro to programming. I was really surprised by what the little kids were tackling; it's part of their homework and everything. It's been a couple of years now since they started doing it.
I'd say it depends entirely on what you want to do. There are fields where advanced math is necessary (most research-oriented fields, machine learning, computer vision, graphics programming, AI ...) and there are fields where you won't really be using it at all (webdev, enterprise programming ...). Basic vector and matrix operations do pop up a lot, but they're pretty easy to learn. There are definitely a lot of jobs out there where advanced math shouldn't be necessary.
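By "basic vector and matrix operations" I mean stuff like this, just a throwaway sketch in plain JS (not any particular library):
// dot product of two equal-length vectors
function dot(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += a[i] * b[i];
  return sum;
}
// multiply a matrix (array of rows) by a vector
function matVec(m, v) {
  return m.map(function (row) { return dot(row, v); });
}
console.log(dot([1, 2, 3], [4, 5, 6]));        // 32
console.log(matVec([[1, 0], [0, 1]], [3, 7])); // [3, 7] (the identity matrix leaves the vector alone)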
Yes, that seems like a typical first Linear Algebra course to me, although the test is written quite obtusely.
Here's what we're practicing on our midterm test to give you an idea. Does this seem more advanced than a typical first Linear Algebra course?
It's been a while since my Linear Algebra class, and I'm hardly an expert on how colleges from other countries teach things, but most of it seems pretty standard in theory. Maps, determining a vector space basis, vector subspaces, all the eigenvalue (and related) questions, Gram-Schmidt and kernels are all pretty basic LA concepts. The only thing I don't remember doing is CBS, although it's been a while so maybe I'm just forgetting.
The test does seem quite tricky though (as Kristoffer says, it's pretty obtuse). Out of curiosity, how much time do they give you for such a test?
We're given an hour and a half. I'm not complaining too much since his test is basically identical to the practice test, but with different values. So we essentially get to master the test before taking it.
I'm suddenly really grateful my math courses had way more lenient time constraints (I think it was an hour for 5 questions, which was plenty of time).
I can definitely see why things would be confusing for you though. In my course most of the questions were relatively straightforward practical problems (like questions 6, 9 or 13 on your example) which could be solved with the well-defined procedures that were taught in the course. The more theoretical questions on your exam seem way more confusing than that, and the time limit isn't all that lenient either. It also covers a lot of concepts for a midterm imo.
Don't let it discourage you though. If you haven't done a lot of math before, the college-level stuff can be really hard to get at first. And I have to say that even in the math-heavy programming courses I took, a lot of this stuff never popped up. I certainly can't remember ever using vector space theory for anything.
The funny thing is that math is really just ultra-unintuitive programming. It's extremely abstract, terse and esoteric. Even if you don't feel super confident, I'd be willing to bet that if you were presented with similar problems in a programming context you'd be much more likely to understand what's going on, and where you don't, you could probably fill in the gaps by walking through it. At the very least programming has the nice property of letting you seal weird things away into a black box that you just consume. You can always peek into the box to satisfy your curiosity, but you don't need to become an expert unless you want to.
I feel like mathematics pedagogy and perhaps science in general could do itself a favor by dropping as much of the obscure terminology as possible. I fail to see how the use of Greek letters makes anything clearer for any student than just using plain descriptive English words. Weights instead of theta, change instead of delta, etc.
You've gotta have a common terminology, both spoken and written, so that people who don't speak the same language can understand the same papers and concepts. That's why math is sometimes called the "universal language", and that's exactly how it should be. Imagine if instead of writing "Let ε > 0" you had this instead:
Teacher: ok everyone think of a number, but it has to be a really really small number, just a liiiiiiiiitle bit bigger than 0.
Student: What do you mean small? How small?
Teacher: FUCKING small.
I think the former is much more descriptive and easy to understand, especially considering you might see the same phrase many times in the same proof.
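For comparison, here's the usual epsilon-delta limit definition written out symbolically (a generic textbook statement, not something from this thread), just to show how much the notation compresses:
\[ \lim_{x \to a} f(x) = L \iff \forall \varepsilon > 0 \; \exists \delta > 0 : 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon \]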
Standard terminology is fine. Greek letters are a relic of the past. We shouldn't stick to the same traditions forever, should we? Imagine if we somehow still used the convention of adopting the shortest variable names possible.
We use Greek letters instead of entire words because multiple letters in a row look like multiplication. Having concision in formula writing is very, very useful.
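To be concrete (a hypothetical example, not from the thread): in standard notation, putting symbols next to each other means multiplication, so a spelled-out word stops reading as a single variable:
\[ \Delta x = x_2 - x_1 \quad \text{vs.} \quad \mathit{change}\,x = x_2 - x_1, \ \text{where } \mathit{change} \text{ would normally parse as } c \cdot h \cdot a \cdot n \cdot g \cdot e \]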
Why not? It sounds like the argument to change pi into tau. Who cares? It's an abstract symbol that's been given a well-defined meaning. What symbol it is doesn't matter; what matters is that you adopt a convention and stick to it. Changing centuries, hell, even millennia of precedent just to seem more futuristic sounds a little extreme, don't you think? They're not that hard to learn. Epsilon looks like a little e, delta looks like a curvy little d, lambda is an upside-down y. Upper-case sigma is a big spiky E. I'm not sure what problem we're trying to solve.
This is circular. If you have a different system it's not necessarily bound by the limitations of the old system. Certain oddities of expression in one programming language don't necessarily appear in others for example. Math does have explicit operators so it's not a fundamental change anyway.
The problem is novice hostility and cognitive load (which can very easily turn into a means of oppression, especially passively). We can most definitely swap "Δ" with "change" and it doesn't make a difference to what you are doing, but one is much more immediately understandable to someone who hasn't taken an upper-level math class. Even if you wanted to get into arguments about brevity, 変 is more immediately understood by far more people than a Greek letter (western academic bias), but perhaps for inclusion you'd choose an emoji, since they are the most universal language. Overall, though, tradition has little basis without something more rigorous behind it; unless you can actually prove that one methodology is better than the other, that methodology should be challenged. It's natural to excuse away the faults of large systems with high investment, pretend they don't exist, or claim "well, not enough people have complained publicly", but this is not a great argument against alternatives.
Show me the differential form of Maxwell's Equations using a different system that is nearly as concise and readable.
Or not, because this is way, way OT
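For reference (standard physics, not from this thread), the equations in question in their usual differential SI form:
\[ \nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t} \]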
The argument against alternatives is that the benefits are immeasurably small and the cost is immeasurably high.
Seems more like an unsubstantiated assumption. The objective thing to do would be to test it, which is what I'm suggesting we do but which we pretty much don't do, despite people being science-minded. You need that human effort component because most people will fall into a sort of equilibrium.
Do I pick the dull, uninteresting job at the cool company A right now? Or do I risk that job for a presumably low chance of an amazing job at the incredible company B?
The wise decision is obviously company A, but, Jesus... B sounds so unbelievably great. Damn.