Why does SparkJava not process the second request on the same connection?

I wrote a small server with a REST API using SparkJava. I am trying to query the REST API with Apache HttpClient. With this client I open a connection, send a first request to the server, and receive the response. Then I reuse the same connection to send a second request. The request is transmitted, but the server does not process it. Does anyone know what I am doing wrong?

Here is a minimal working example:

Maven dependencies:

        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.9.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents.client5</groupId>
            <artifactId>httpclient5</artifactId>
            <version>5.0.3</version>
        </dependency>

Server class:

package minimal;

import spark.Spark;

public class Server {

  public static void main(String[] args) {
    Spark.post("/a", (req, resp) -> {
          resp.status(204);
          return "";
        });
    Spark.post("/b", (req, resp) -> {
          resp.status(204);
          return "";
        });
    Spark.before((req, res) -> {
          System.out.println("Before: Request from " + req.ip() + " received " + req.pathInfo());
        });
    Spark.after((req, res) -> {
          System.out.println("After: Request from " + req.ip() + " received " + req.pathInfo());
        });
  }
}
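(SparkJava's embedded Jetty listens on port 4567 by default, which is why the client below targets http://localhost:4567.)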

Client class:

package minimal;

import java.io.IOException;

import org.apache.hc.client5.http.classic.methods.HttpPost;
import org.apache.hc.client5.http.impl.classic.CloseableHttpClient;
import org.apache.hc.client5.http.impl.classic.CloseableHttpResponse;
import org.apache.hc.client5.http.impl.classic.HttpClients;

public class Client {

  public static void main(String[] args) throws IOException {
    try (CloseableHttpClient httpclient = HttpClients.createDefault()) {
      // First request: the default client keeps the connection alive afterwards
      HttpPost httpPost1 = new HttpPost("http://localhost:4567/a");
      try (CloseableHttpResponse response1 = httpclient.execute(httpPost1)) {
        System.out.println(response1.getCode() + " " + response1.getReasonPhrase());
      }

      // Second request: sent over the same, reused connection
      HttpPost httpPost2 = new HttpPost("http://localhost:4567/b");
      try (CloseableHttpResponse response2 = httpclient.execute(httpPost2)) {
        System.out.println(response2.getCode() + " " + response2.getReasonPhrase());
      }
    }
  }
}

Server console output:

Before: Request from 127.0.0.1 received /a
After: Request from 127.0.0.1 received /a

Here is an abridged tcpdump capture:

14:52:15.210468 IP localhost.44020 > localhost.4567:
POST /a HTTP/1.1
Accept-Encoding: gzip, x-gzip, deflate
Host: localhost:4567
Connection: keep-alive
User-Agent: Apache-HttpClient/5.0.3 (Java/1.8.0_282)

14:52:15.271563 IP localhost.4567 > localhost.44020:
HTTP/1.1 204 No Content
Date: Tue, 27 Apr 2021 12:52:15 GMT
Content-Type: text/html;charset=utf-8
Server: Jetty(9.4.26.v20200117)

14:52:15.277376 IP localhost.44020 > localhost.4567:
POST /b HTTP/1.1
Accept-Encoding: gzip, x-gzip, deflate
Host: localhost:4567
Connection: keep-alive
User-Agent: Apache-HttpClient/5.0.3 (Java/1.8.0_282)

After this, no further response from the server is captured.
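One way to confirm that the reused keep-alive connection is what triggers the problem (a diagnostic, not a fix) is to build a client that never reuses connections. A minimal sketch against the httpclient5 API used above, with a reuse strategy that always declines:

import org.apache.hc.client5.http.impl.classic.CloseableHttpClient;
import org.apache.hc.client5.http.impl.classic.HttpClients;

// A client that opens a fresh connection for every request,
// because the reuse strategy marks every connection as non-reusable.
CloseableHttpClient httpclient = HttpClients.custom()
    .setConnectionReuseStrategy((request, response, context) -> false)
    .build();

If both requests succeed with such a client, the failure is tied to the second request arriving on a reused connection.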

Here is a client example. Please try it and see whether it works for you; I tested it and it works fine.

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

import java.io.IOException;

public class T1 {

    // Sends a POST to the given URL and prints the thread name and status line
    static void runPost(CloseableHttpClient c, String s) {
        HttpPost httpPost1 = new HttpPost(s);
        try (CloseableHttpResponse response1 = c.execute(httpPost1)) {
            System.out.println(Thread.currentThread().getName() + ": " +
                    response1.getStatusLine().getStatusCode() + " " +
                    response1.getStatusLine().getReasonPhrase());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) throws IOException {
        try (CloseableHttpClient httpclient = HttpClients.createDefault()) {
            T1.runPost(httpclient, "http://localhost:4567/a");
            T1.runPost(httpclient, "http://localhost:4567/b");
        }
        System.exit(0);
    }
}
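Note that this example uses the older HttpClient 4.x API (the org.apache.http.* imports and getStatusLine()), so it needs the 4.x artifact rather than httpclient5. For example (4.5.13 is just an illustrative 4.x version):

        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.5.13</version>
        </dependency>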

The reason the SparkJava server did not process the second request was that my project contained the following additional Maven dependency:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.4.0.7.1.1.0-565</version>
        </dependency>

After removing this dependency, the SparkJava server works as expected.
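For anyone debugging a similar clash: com.sparkjava:spark-core and org.apache.spark:spark-core_2.11 each pull in their own server stack, and the transitively resolved versions (for example of Jetty) can conflict on the classpath. Maven's dependency tree shows what is actually resolved, e.g.:

mvn dependency:tree -Dincludes=org.eclipse.jetty

If the tree reports a Jetty version other than the 9.4.x that SparkJava 2.9.3 ships with (the tcpdump above shows Jetty 9.4.26), the extra dependency is the likely culprit.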