Swift: downloading data from url causes semaphore_wait_trap freeze

In my app, tapping a button downloads data from an Internet site. The site is a list of links that contain binary data. Sometimes the first link may not contain the right data; in that case the app takes the next link in the array and gets the data from there. That link is correct.

The problem I have is that when I tap the button, the app quite often (though not always) freezes for several seconds. After 5-30 seconds it unfreezes and downloads everything normally. I understand that something is blocking the main thread. When I pause the process in Xcode, I get this (note the semaphore_wait_trap):

This is how I do it:

// Button Action
@IBAction func downloadWindNoaa(_ sender: UIButton)
    {
            // Starts activity indicator
            startActivityIndicator()

            // Starts downloading and processing data

            // Either use this
            DispatchQueue.global(qos: .default).async
                {
                    DispatchQueue.main.async
                        {
                            self.downloadWindsAloftData()
                        }
                }


            // Or this - no difference.
            //downloadWindsAloftData()
    }

func downloadWindsAloftData()
    {
        // Creates a list of website addresses to request data: CHECKED.
        self.listOfLinks = makeGribWebAddress()

        // Extract and save the data
        saveGribFile()
    }

// This downloads the data and saves it in a required format. I suspect, this is the culprit

    func saveGribFile()
    {
        // Check if the links have been created
        if (!self.listOfLinks.isEmpty)
        {
            /// Instance of OperationQueue
            queue = OperationQueue()

            // Convert array of Strings to array of URL links
            let urls = self.listOfLinks.map { URL(string: $0)! }

            guard self.urlIndex != urls.count else
            {
                NSLog("report failure")
                return
            }

            // Current link
            let url = urls[self.urlIndex]

            // Increment the url index
            self.urlIndex += 1

            // Add operation to the queue
            queue.addOperation { () -> Void in

                // Variables for Request, Queue, and Error
                let request = URLRequest(url: url)
                let session = URLSession.shared

                // Array of bytes that will hold the data
                var dataReceived = [UInt8]()

                // Read data
                let task = session.dataTask(with: request) {(data, response, error) -> Void in

                    if error != nil
                    {
                        print("Request transport error")
                    }
                    else
                    {
                        let response = response as! HTTPURLResponse
                        let data = data!

                        if response.statusCode == 200
                        {
                            // Convert the downloaded Data into an array of bytes
                            dataReceived = [UInt8](data)
                        }
                        else
                        {
                            print("Request server-side error")
                        }
                    }

                    // Main thread
                    OperationQueue.main.addOperation(
                        {
                            // If downloaded data is less than 2 KB in size, repeat the operation
                            if dataReceived.count <= 2000
                            {
                                self.saveGribFile()
                            }

                            else
                            {
                                self.setWindsAloftDataFromGrib(gribData: dataReceived)

                                // Reset the URL Index back to 0
                                self.urlIndex = 0
                            }
                        }
                    )
                }
                task.resume()
            }
        }
    }


// Processing data further
func setWindsAloftDataFromGrib(gribData: [UInt8])
    {
        // Stops spinning activity indicator
        stopActivityIndicator()

        // Other code to process data...
    }

// Makes Web Address

let GRIB_URL = "http://xxxxxxxxxx"

func makeGribWebAddress() -> [String]
    {
        var finalResult = [String]()

        // Main address site
        let address1 = "http://xxxxxxxx"

        // Address part with type of data
        let address2 = "file=gfs.t";
        let address4 = "z.pgrb2.1p00.anl&lev_250_mb=on&lev_450_mb=on&lev_700_mb=on&var_TMP=on&var_UGRD=on&var_VGRD=on"

        let leftlon = "0"
        let rightlon = "359"
        let toplat = "90"
        let bottomlat = "-90"

        // Address part with coordinates
        let address5 = "&leftlon="+leftlon+"&rightlon="+rightlon+"&toplat="+toplat+"&bottomlat="+bottomlat

        // Vector that includes all Grib files available for download
        let listOfFiles = readWebToString()

        if (!listOfFiles.isEmpty)
        {
            for i in 0..<listOfFiles.count
            {
                // Part of the link that includes the file
                let address6 = "&dir=%2F"+listOfFiles[i]

                // Extract time: last 2 characters
                let address3 = listOfFiles[i].substring(from:listOfFiles[i].index(listOfFiles[i].endIndex, offsetBy: -2))

                // Make the link
                let addressFull = (address1 + address2 + address3 + address4 + address5 + address6).trimmingCharacters(in: .whitespacesAndNewlines)

                finalResult.append(addressFull)
            }
        }

        return finalResult;
    }


func readWebToString() -> [String]
    {
        // Final array to return
        var finalResult = [String]()

        guard let dataURL = NSURL(string: self.GRIB_URL)
            else
        {
            print("IGAGribReader error: No URL identified")
            return []
        }

        do
        {
            // Get contents of the page
            let contents = try String(contentsOf: dataURL as URL)

            // Regular expression
            let expression : String = ">gfs\\.\\d+<"
            let range = NSRange(location: 0, length: contents.characters.count)

            do
            {
                // Match the URL content with regex expression
                let regex = try NSRegularExpression(pattern: expression, options: NSRegularExpression.Options.caseInsensitive)
                let contentsNS = contents as NSString
                let matches = regex.matches(in: contents, options: [], range: range)

                for match in matches
                {
                    for i in 0..<match.numberOfRanges
                    {
                        let resultingNS = contentsNS.substring(with: (match.rangeAt(i))) as String
                        finalResult.append(resultingNS)
                    }
                }

                // Remove "<" and ">" from the strings
                if (!finalResult.isEmpty)
                {
                    for i in 0..<finalResult.count
                    {
                        finalResult[i].remove(at: finalResult[i].startIndex)
                        finalResult[i].remove(at: finalResult[i].index(before: finalResult[i].endIndex))
                    }
                }
            }
            catch
            {
                print("IGAGribReader error: No regex match")
            }

        }
        catch
        {
            print("IGAGribReader error: URL content is not read")
        }


        return finalResult;
    }

I have been trying to fix this for the past few weeks with no success. Any help would be greatly appreciated!

        let contents = try String(contentsOf: dataURL as URL)

You are calling String(contentsOf: url) on the main thread (the main queue). That synchronously downloads the contents of the URL into a string. The main thread drives the UI, and running synchronous networking code on it freezes the UI. This is a big no-no.

You should not call readWebToString() on the main queue. Doing DispatchQueue.main.async { self.downloadWindsAloftData() } puts the block right back onto the main queue, which is exactly what you want to avoid. (async just means "execute this later"; it still executes on DispatchQueue.main.)

You should run downloadWindsAloftData on a global queue instead of the main queue:

    DispatchQueue.global(qos: .default).async {
        self.downloadWindsAloftData()
    }

Only use DispatchQueue.main.async when you want to update the UI.
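
As a general pattern: do the heavy lifting on a background queue, and hop back onto the main queue only for the UI update. A minimal sketch of that shape (doSomethingSlow and updateLabel are hypothetical placeholders, not functions from your code):

    DispatchQueue.global(qos: .default).async {
        // Long-running work stays off the main thread.
        let result = doSomethingSlow()

        DispatchQueue.main.async {
            // Only the UI update touches the main queue.
            updateLabel(with: result)
        }
    }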

Your stack trace tells you that it is stopped inside String(contentsOf:), called by readWebToString, which is called by makeGribWebAddress.

The problem is that String(contentsOf:) performs a synchronous network request. If that request takes any time at all, it blocks the thread it runs on, and if you call it from the main thread, your app freezes.

In theory you could dispatch that work to a background queue, but that merely hides the deeper problem: you are making network requests with an API that is synchronous, not cancelable, and offers no meaningful error reporting.

You really should use URLSession for asynchronous requests, just as you already do elsewhere, and avoid String(contentsOf:) with remote URLs.
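
For example, here is a rough sketch of what readWebToString could look like rewritten around URLSession. It keeps your GRIB_URL, your regular expression, and the trimming of "<" and ">", but the completion-handler shape and the error handling are illustrative, not a drop-in replacement:

    // Asynchronous variant of readWebToString(); the completion handler is
    // called on URLSession's background queue, so dispatch back to the main
    // queue before using the result to update the UI.
    func fetchFileList(completion: @escaping ([String]) -> Void) {
        guard let url = URL(string: GRIB_URL) else {
            print("IGAGribReader error: No URL identified")
            completion([])
            return
        }

        URLSession.shared.dataTask(with: url) { data, _, error in
            guard error == nil,
                  let data = data,
                  let contents = String(data: data, encoding: .utf8),
                  let regex = try? NSRegularExpression(pattern: ">gfs\\.\\d+<",
                                                       options: .caseInsensitive)
            else {
                print("IGAGribReader error: URL content is not read")
                completion([])
                return
            }

            var fileNames = [String]()
            let contentsNS = contents as NSString
            let range = NSRange(location: 0, length: contentsNS.length)

            for match in regex.matches(in: contents, options: [], range: range) {
                // Strip the surrounding "<" and ">" from each match.
                var name = contentsNS.substring(with: match.range)
                name.remove(at: name.startIndex)
                name.remove(at: name.index(before: name.endIndex))
                fileNames.append(name)
            }

            completion(fileNames)
        }.resume()
    }

makeGribWebAddress would then also take a completion handler and build the addresses once the file list arrives, instead of returning an array synchronously.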