Brain one way, but other brain other way. Chemical stuff is making brain stuff happen. Makes see different.
Companies try to maximize green per red. By paying less and getting the same, they maximize that, year after year, until (in a temporary and unforeseeable setback) you leave for… bluer pastures, apparently.
There are different sorts of companies, and the more they think of employees as a number of years of experience plus a stack of skills, the more susceptible they are to believing that replacing humans with other equally skilled humans is a productive way to spend their time.
If you take the sun out of the equation, the planets fly apart in all directions. Hope that helps ;)
You can just issue new certificates once per year, and otherwise keep your personal root CA key encrypted. If someone is far enough into your system that they can grab the key as you use it, you have bigger things to worry about than them impersonating your own services to you.
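For reference, the yearly reissue is only a few lines with the Python cryptography package. Everything here is a placeholder (file names, password, hostname), and it assumes an EC or RSA root key:

```python
from datetime import datetime, timedelta, timezone

from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# Decrypt the root key only for this signing step; it stays encrypted on disk.
with open("root-ca.key", "rb") as f:
    root_key = serialization.load_pem_private_key(f.read(), password=b"hunter2")
with open("root-ca.pem", "rb") as f:
    root_cert = x509.load_pem_x509_certificate(f.read())

# Fresh key for the service; the reissued certificate is good for one year.
leaf_key = ec.generate_private_key(ec.SECP256R1())
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "nas.home.lan")])
now = datetime.now(timezone.utc)
leaf_cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(root_cert.subject)
    .public_key(leaf_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + timedelta(days=365))
    .add_extension(
        x509.SubjectAlternativeName([x509.DNSName("nas.home.lan")]), critical=False
    )
    .sign(root_key, hashes.SHA256())
)

with open("nas.home.lan.pem", "wb") as f:
    f.write(leaf_cert.public_bytes(serialization.Encoding.PEM))
with open("nas.home.lan.key", "wb") as f:
    f.write(leaf_key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.NoEncryption(),
    ))
```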
I wanted to know how important this really would be. Human reaction times among gamers are on the order of 150-300 ms, and professional gamers mostly manage 150-200 ms. A display refreshing 700 times per second shows a new frame every 1.4 ms, while one refreshing 60 times per second shows a new frame every 16.7 ms.
In a reaction-timing-heavy game, that’s not enough to bridge the gap between the fastest players in the world and the slowest professionals, but it’s on the right order of magnitude to make a difference in professional-level play against a 60 Hz display. On the other hand, it’s only a marginal step up from a 240 Hz display, and the loss in resolution must have an effect at some point.
There are probably games where this is better, but only when the difference is small, or when the other display is handicapped.
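For anyone who wants to poke at the numbers, here’s the back-of-the-envelope version. The only added assumption is that an on-screen event lands at a random point in the refresh cycle, so a display adds half a frame of delay on average:

```python
# Back-of-the-envelope: how much latency does the refresh rate add,
# compared with the spread in human reaction times?
for hz in (60, 240, 700):
    frame_ms = 1000 / hz
    # An event lands at a random point in the refresh cycle, so the
    # display adds half a frame on average, a full frame at worst.
    print(f"{hz:>4} Hz: frame {frame_ms:5.2f} ms, "
          f"avg added delay {frame_ms / 2:5.2f} ms")

# Spread between the fastest players (~150 ms) and slower pros (~200 ms):
print("reaction-time spread: ~50 ms")
```

Going from 60 Hz to 700 Hz saves about 7.6 ms of average delay, which is visible against a ~50 ms spread between players; going from 240 Hz to 700 Hz saves about 1.4 ms, which is where the marginal step up comes from.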
“You wouldn’t put on a tricorn hat, would you?”
I actually would, if I could find a nice one…
“…and leave your job to sail the seas?”
… That’s an option? I didn’t even consider-
“And you certainly wouldn’t drink rum, and fire cannons, and carry a saber and tell silly parrot related puns.”
buys a tricorn hat
Last time I tried FreeCAD, the geometry solver was incorrect, so it would sometimes create two (or more) shapes from a fully constrained part. Since learning about OpenSCAD, I’ve seen no reason to give it another try.
I’ve known a lot of math people, and /on average/ I think they’re more capable of writing useful code than the other college-graduate groups I’ve spent a lot of time working with (psychology, economics, physics).
That said, the best mathematicians I’ve known were mostly rubbish at real programming, and the best programmers I’ve known have come out of computer engineering or computer science.
If you need a correct but otherwise useless implementation, a mathematician is a pretty good bet. If you need performance, readability, or documentation, I’d look elsewhere most of the time.
We have only a single advanced civilization to use as a comparison point for the strength of our telescopes, and that’s ourselves. From my understanding, the most powerful broadcast we’ve sent out is 15-20 MW, for an over-the-horizon radar system, and that only ran for 40 years or so. I don’t have an exact answer, but my understanding is that even for our largest radio telescope, 20 megawatts at a distance of 100 lightyears would be below the noise floor.
Nuclear tests are slightly more visible than that, but they only happen sporadically, so you’d have to have a telescope facing the right way by coincidence. Basically, if there’s an Earth-like civilization 200 lightyears away, I think we would be entirely blind to it, and that’s a tiny distance in the scheme of things.
The farthest known exoplanet is 27,710 lightyears away, and was discovered by the transit method - but this was made possible because the planet is very big (bigger and heavier than Jupiter), and orbits quite close to its star (with a 43 hour orbit). To be detectable at that range, a signal has to be stronger than some stars are bright.
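To put a rough number on the noise floor claim: the inverse-square falloff alone gets you most of the way. The bandwidth below is my guess, not a known spec, and I’m treating the radar as an isotropic radiator (a directional beam adds some gain in its beam, but not enough orders of magnitude to change the conclusion):

```python
import math

P = 20e6                     # transmitter power, W (the OTH radar above)
d = 100 * 9.461e15           # 100 lightyears, in metres
bandwidth = 1e6              # assumed signal bandwidth, Hz (a guess)

flux = P / (4 * math.pi * d**2)          # W/m^2, isotropic radiator
flux_density = flux / bandwidth          # W/(m^2 Hz)
jansky = flux_density / 1e-26            # 1 Jy = 1e-26 W/(m^2 Hz)

print(f"{flux:.2e} W/m^2 -> {jansky:.2e} Jy")
# ~1.8e-10 Jy: big radio telescopes reach roughly microjanskys (1e-6 Jy)
# with long integration, so this falls several orders of magnitude short.
```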
It depends on how much other people care about your data, and how much physical control you have over the devices. If you’re in “nation states would like to have it” territory, you should never have unencrypted data at rest or on the wire. If you’re a regular home user and all your computer stuff is inside your own house, you’re probably fine. In between, there are a lot of possibilities. Encryption is cheap.
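Cheap in the literal sense, too - authenticated encryption at rest is a handful of lines, e.g. with the Python cryptography package (the plaintext here is just illustrative):

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # keep this somewhere other than the data
box = Fernet(key)

token = box.encrypt(b"the backup nobody should read")
assert box.decrypt(token) == b"the backup nobody should read"
```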
An R320 is new enough that iDRAC Express (two IPs on one interface) is available in the BIOS. The server needs a license for some features (like remotely attaching an ISO, and remote KVM), but not for the basics like controlling the power.
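For the power basics, plain IPMI works once it’s enabled in the iDRAC settings. A minimal sketch shelling out to ipmitool from Python; the address and credentials are obviously placeholders:

```python
import subprocess

# Assumes "IPMI over LAN" is enabled in the iDRAC settings; the host,
# user, and password below are placeholders for your own.
def idrac_power(host: str, user: str, password: str, action: str) -> str:
    out = subprocess.run(
        ["ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password,
         "chassis", "power", action],   # action: status|on|off|cycle
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

print(idrac_power("192.0.2.10", "root", "changeme", "status"))
```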
It sure loads quickly, though
When a monopoly is faced with a smaller, more efficient competitor, they cut prices to keep people from switching, or they buy the new competitor, make themselves more efficient, and increase profits.
When Steam was faced with smaller competition that charged lower prices, they did - nothing. They’re not the leader because of a trick, or clever marketing, but because they give both publishers and gamers a huge stack of things they want.