
Node.js is Cancer

First, a repost of the original article (the original page at http://teddziuba.com/2011/10/node-js-is-cancer.html seems to be gone now):
by Ted Dziuba on Saturday, October 01, 2011

If there's one thing web developers love, it's knowing better than conventional wisdom, but conventional wisdom is conventional for a reason: that sh*t works. Something's been bothering me for a while about this node.js nonsense, but I never took the time to figure it out until I read this butthurt post from Ryan Dahl, Node's creator. I was going to shrug it off as just another jackass who whines because Unix is hard. But, like a police officer who senses that something isn't quite right about the family in a minivan he just pulled over and discovers fifty kilos of black horse heroin in the back, I thought that something wasn't quite right about this guy's aw-shucks sob story, and that maybe, just maybe, he has no idea what he is doing, and has been writing code unchecked for years.

Since you're reading about it here, you probably know how my hunch turned out.

Node.js is a tumor on the programming community, in that not only is it completely braindead, but the people who use it go on to infect other people who can't think for themselves, until eventually, every asshole I run into wants to tell me the gospel of event loops. Have you accepted epoll into your heart?
A Scalability Disaster Waiting to Happen

Let's start with the most horrifying lie: that node.js is scalable because it "never blocks" (Radiation is good for you! We'll put it in your toothpaste!). On the Node home page, they say this:

    Almost no function in Node directly performs I/O, so the process never blocks. Because nothing blocks, less-than-expert programmers are able to develop fast systems.

This statement is enticing, encouraging, and completely f*cking wrong.

Let's start with a definition, because you Reddit know-it-alls keep your specifics in the pedantry. A function call is said to block when the current thread of execution's flow waits until that function is finished before continuing. Typically, we think of I/O as "blocking", for example, if you are calling socket.read(), the program will wait for that call to finish before continuing, as you need to do something with the return value.
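
To make the distinction concrete, here is a minimal sketch using Node's built-in fs module (the file path is just a placeholder): the synchronous call blocks until the data is back, while the asynchronous one hands the work off and moves on.

var fs = require('fs');

// Blocking: execution stops here until the whole file has been read.
var data = fs.readFileSync('/etc/hostname', 'utf8');
console.log('sync read finished:', data.trim());

// Non-blocking: the read is handed to the OS; the callback runs later
// and the current thread of execution continues immediately.
fs.readFile('/etc/hostname', 'utf8', function (err, contents) {
  if (err) throw err;
  console.log('async read finished:', contents.trim());
});
console.log('this line prints before the async callback fires');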

Here's a fun fact: every function call that does CPU work also blocks. This function, which calculates the n'th Fibonacci number, will block the current thread of execution because it's using the CPU.

function fibonacci(n) {
  if (n < 2)
    return 1;
  else
    return fibonacci(n-2) + fibonacci(n-1);
}

(Yes, I know there's a closed form solution. Shouldn't you be in front of a mirror somewhere, figuring out how to introduce yourself to her?)

Let's see what happens to a node.js program that has this little gem as its request handler:

var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end(String(fibonacci(40)));
}).listen(1337, "127.0.0.1");

On my older laptop, this is the result:

ted@lorenz:~$ time curl http://localhost:1337/
165580141
real 0m5.676s
user 0m0.010s
sys 0m0.000s

5 second response time. Cool. So we all know JavaScript isn't a terribly fast language, but why is this such an indictment? It's because Node's evented model and brain damaged fanboys make you think everything is OK. In really abusive pseudocode, this is how an event loop works:

while(1) {
  ready_file_descriptor = event_library->poll();
  handle_request(ready_file_descriptor);
}

That's all well and good if you know what you're doing, but when you apply this to a server problem, you've pluralized that sh*t. If this loop is running in the same thread that handle_request is in, any programmer with a pulse will notice that the request handler can hold up the event loop, no matter how asynchronous your library is.
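
A minimal sketch of that failure mode: a timer that should tick every 100 ms goes silent for the entire duration of a synchronous fibonacci(40) call, because the timer callbacks and the CPU work share the one event-loop thread. (Stop it with Ctrl+C.)

function fibonacci(n) {
  return n < 2 ? 1 : fibonacci(n - 2) + fibonacci(n - 1);
}

// Should print roughly every 100 ms.
setInterval(function () {
  console.log('tick', Date.now());
}, 100);

setTimeout(function () {
  console.log('starting CPU-bound work');
  fibonacci(40);   // blocks the event loop; no ticks are printed meanwhile
  console.log('done');
}, 500);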

So, given that, let's see how my little node server behaves under the most modest load, 10 requests, 5 concurrent:

ted@lorenz:~$ ab -n 10 -c 5 http://localhost:1337/
...
Requests per second:    0.17 [#/sec] (mean)
...

0.17 queries per second. Diesel. Sure, Node allows you to fork child processes, but at that point your threading/event model is so tightly coupled that you've got bigger problems than scalability.
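
For reference, the child-process escape hatch looks roughly like the following sketch with Node's cluster module: one worker per CPU core, each with its own event loop, so a single CPU-bound request no longer stalls every other client. The port matches the earlier example.

var cluster = require('cluster');
var http = require('http');
var os = require('os');

function fibonacci(n) {
  return n < 2 ? 1 : fibonacci(n - 2) + fibonacci(n - 1);
}

if (cluster.isMaster) {
  // Fork one worker per CPU core; each worker runs its own event loop.
  os.cpus().forEach(function () {
    cluster.fork();
  });
} else {
  http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end(String(fibonacci(40)));
  }).listen(1337, '127.0.0.1');
}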

Considering Node's original selling point, I'm God Damned terrified of any "fast systems" that "less-than-expert programmers" bring into this world.
Node Punishes Developers Because it Disobeys the Unix Way

A long time ago, the original neckbeards decided that it was a good idea to chain together small programs that each performed a specific task, and that the universal interface between them should be text.

If you develop on a Unix platform and you abide by this principle, the operating system will reward you with simplicity and prosperity. As an example, when web applications first began, the web application was just a program that printed text to standard output. The web server was responsible for taking incoming requests, executing this program, and returning the result to the requester. We called this CGI, and it was a good way to do business until the micro-optimizers sank their grubby meathooks into it.

Conceptually, this is how any web application architecture that's not cancer still works today: you have a web server program whose job is to accept incoming requests, parse them, and figure out the appropriate action to take. That can be either serving a static file, running a CGI script, proxying the connection somewhere else, whatever. The point is that the HTTP server isn't the same entity doing the application work. Developers who have been around the block call this separation of responsibility, and it exists for a reason: loosely coupled architectures are very easy to maintain.

And yet, Node seems oblivious to this. Node has (and don't laugh, I am not making this sh*t up) its own HTTP server, and that's what you're supposed to use to serve production traffic. Yeah, that example above when I called http.createServer(), that's the preferred setup.

If you search around for "node.js deployment", you find a bunch of people putting Nginx in front of Node, and some people use a thing called Fugue, which is another JavaScript HTTP server that forks a bunch of processes to handle incoming requests, as if somebody maybe thought that this "nonblocking" snake oil might have an issue with CPU-bound performance.

If you're using Node, there's a 99% probability that you are both the developer and the system administrator, because any system administrator would have talked you out of using Node in the first place. So you, the developer, must face the punishment of setting up this HTTP proxying orgy if you want to put a real web server in front of Node for things like serving statics, query rewriting, rate limiting, load balancing, SSL, or any of the other futuristic things that modern HTTP servers can do. That, and it's another layer of health checks that your system will need.

Although, let's be honest with ourselves here, if you're a Node developer, you are probably serving the application directly from Node, running in a screen session under your account.
It's F*cking JavaScript

This is probably the worst thing any server-side framework can do: be written in JavaScript.

if (typeof my_var !== "undefined" && my_var !== null) {
  // you idiots put Rasmus Lerdorf to shame
}

What is this I don't even...

tl;dr

Node.js is an unpleasant software library and I will not use it.
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Then someone named Brady posted a response:

http://www.uberbrady.com/2011/10/nodejs-is-not-cancer-you-are-just-moron.html

Node.js is not a cancer, you are just a moron

My tone is going to seem strangely even and un-ranty. This is because I am doing everything I can to keep myself from completely exploding when I read this bullsh*t that this moron is spewing. OK, that was a little ranty, but the rest will read evenly. Maybe.

So one of my programming friends posts an article at http://teddziuba.com/2011/10/node-js-is-cancer.html and says, "Ah, here's what's wrong with Node.js!"

The article is rather strongly written - "Node.js is Cancer", "node.js nonsense", "Node.js is a tumor on the programming community", "completely braindead", "Scalability disaster", etc.

He then shows a recursive Fibonacci calculation and how it performs badly under Node.

The problem he has proposed is, fundamentally, CPU-bound. I wrote a version of it in C and it did perform faster than it did in Node, but still, the problem definitely took finite time. My command-line Node.js version calculated the answer in 8 seconds, the C version did it in 4. I was rather impressed that Javascript (Node.js's V8 engine) was able to come that close to C's performance in pure CPU-bound execution.
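
A command-line version of that measurement looks roughly like the sketch below (the file name is arbitrary; run it with node fib.js):

// fib.js
function fibonacci(n) {
  return n < 2 ? 1 : fibonacci(n - 2) + fibonacci(n - 1);
}

console.time('fibonacci(40)');
console.log(fibonacci(40));
console.timeEnd('fibonacci(40)');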

The problem, and what the author perhaps misunderstands, is that this is not the situation in which Node is an ideal solution. I use Node.js in production for work - and I know of many other shops that do too. If the problems you are dealing with are CPU-related, Node.js will not help you. Node.js works well when your problems are I/O-related - e.g., reading something out of a database, running web servers, reading files, writing files, writing to queues, reading from queues, reading from other web services, aggregating several web services together, etc. The reason that this solution has become so popular of late is because these are the types of problems that are most common in web development today. Thus, node.js becomes a helpful arrow in one's quiver with which to solve these types of issues.
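
A rough sketch of that I/O-bound case: a Node handler fans out several slow HTTP calls in parallel and responds once all of them have finished. The service hostnames below are placeholders, not real endpoints.

var http = require('http');

// Fetch one upstream resource and hand the body (or an error) to the callback.
function fetch(host, path, callback) {
  http.get({ host: host, path: path }, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () { callback(null, body); });
  }).on('error', callback);
}

http.createServer(function (req, res) {
  var pending = 2;
  var results = {};

  // Respond only once both upstream calls have come back.
  function done() {
    if (--pending === 0) {
      res.writeHead(200, {'Content-Type': 'application/json'});
      res.end(JSON.stringify(results));
    }
  }

  fetch('service-a.example.com', '/status', function (err, body) {
    results.a = err ? 'error' : body;
    done();
  });
  fetch('service-b.example.com', '/status', function (err, body) {
    results.b = err ? 'error' : body;
    done();
  });
}).listen(8080, '127.0.0.1');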

Considering that the article's author seems to have some level of experience, I wonder if his choice of skewed example was perhaps deliberate. He has other articles on his blog about other event-loop libraries. His comment at the bottom - "tl;dr - Node.js is an unpleasant software library and I will not use it" - is possibly the real source of his anger. And - an irrefutable point - if you don't like something, you don't want to use it, and he obviously doesn't. That's fine.

Node is a tool; one of many - no panacea. If you're dealing with problems of 'slow' services that need to wait for various bits of I/O to complete in order to return a result - it can be a very powerful and useful tool. If you're computing the fortieth member of the Fibonacci sequence recursively, it won't be.

The sad fact is that the author's completely valid point - that Node.js isn't a good tool for CPU-bound problems - is completely buried in his bile. This is because he never states it explicitly. Node.js has other drawbacks as well - it's very easy to end up in callback-spaghetti, it's very minimal, and it's very very very young. The database integration libraries have some pretty serious immaturity issues to work through; and I've had to code around a good deal of that.
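
The callback-spaghetti problem looks roughly like this once a handler needs a few dependent I/O steps in a row; the load* functions below are made-up stand-ins for real I/O.

// Hypothetical async steps, stubbed with setTimeout to stand in for real I/O.
function loadUser(id, cb)        { setTimeout(function () { cb(null, { id: id }); }, 10); }
function loadOrders(user, cb)    { setTimeout(function () { cb(null, [1, 2, 3]); }, 10); }
function loadInvoices(orders, cb){ setTimeout(function () { cb(null, ['inv-1']); }, 10); }

// Three dependent steps already push the logic three levels deep.
loadUser(42, function (err, user) {
  if (err) throw err;
  loadOrders(user, function (err, orders) {
    if (err) throw err;
    loadInvoices(orders, function (err, invoices) {
      if (err) throw err;
      console.log(user, orders, invoices);
    });
  });
});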

It's a tool that's good at particular things, and I will continue to use it for those things. Those 'things' tend to be the bulk of what web development and web services development actually are. So when I can write a two hundred line program that can replace entire arrays of servers and interconnected services with just one server, I am going to do that, and I won't feel particularly braindead in doing so.
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Here is the whole story:
Ted Dziuba questioned the "non-blocking" claim made for Node.js, using a Fibonacci example: a single call to the Fibonacci function took more than 5 seconds, and while that call was running, the thread was blocked.
Taking that as his starting point, Ted extrapolated from it and tore into Node.js and JavaScript at length...

Brady then fired back on his blog:
On his own machine he ran the Fibonacci function in both JavaScript and C, which took 8 seconds and 4 seconds respectively; his first reaction was surprise that the JS engine got that close to C. He then laid out what Node.js is and is not suited for: in his view Node.js is good at I/O-related work and a poor fit for heavily CPU-bound work, and the people attacking Node.js have simply picked the wrong use case. Node.js's "non-blocking" applies only to I/O; a CPU-heavy computation will still block the thread, and that is exactly where the weakness of Node.js's single-threaded model shows up.

So how do you play to its strengths and avoid its weaknesses? Brady has already given the hint: JavaScript is very well suited to callback-style development, so any time-consuming operation should be turned into an asynchronous call.
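
A minimal sketch of that advice, assuming a hypothetical worker script named fib-worker.js: the request handler forks the expensive calculation out to a child process, so the event loop stays free while the worker burns CPU.

// fib-worker.js (hypothetical file name) -- runs in a separate process
function fibonacci(n) {
  return n < 2 ? 1 : fibonacci(n - 2) + fibonacci(n - 1);
}
process.on('message', function (n) {
  process.send(fibonacci(n));
});

// server.js -- the web-facing process; its event loop never runs fibonacci itself
var http = require('http');
var fork = require('child_process').fork;

http.createServer(function (req, res) {
  var worker = fork(__dirname + '/fib-worker.js');
  worker.on('message', function (result) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end(String(result));
    worker.kill();
  });
  worker.send(40);
}).listen(1337, '127.0.0.1');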

Reposted from: https://www.cnblogs.com/crabb/p/3449049.html
