I have the following piece of code:
try{
            SAXParserFactory spf = SAXParserFactory.newInstance();
            SAXParser sp = spf.newSAXParser();
            /* Get the XMLReader of the SAXParser we created. */
            XMLReader r = sp.getXMLReader();
            //This handles the xml and populates the entries array
            XMLHandler handler = new XMLHandler();
            // register event handlers
            r.setContentHandler(handler);
            String url = "http://news.library.ryerson.ca/api/isbnsearch.php?isbn="+ISBN;
            r.parse(url);
            return handler.getEntries();
        }
This code works fine most of the time, but there are cases where a user enters the ISBN of a popular book with 100+ related ISBNs (Harry Potter, for example). When that happens, the XML feed does not break, but it takes longer to load (30+ seconds in extreme cases). While the page is loading the connection is never dropped; it just takes its time.
Is there a way to increase the timeout time for the function?
Thanks
Solved this one myself by adding this in:

//open the URL as a stream, so it does not time out prematurely
URL url = new URL("http://foobar/isbnsearch.php?isbn=" + ISBN);
InputStream stream = url.openStream();
try {
    r.parse(new InputSource(stream));
} finally {
    stream.close();
}
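If you want explicit control over the timeouts rather than relying on `openStream()` defaults, you can go through `HttpURLConnection`, which exposes `setConnectTimeout` and `setReadTimeout`. A minimal sketch (the 10 s / 60 s values and the helper name are assumptions, not from the original post):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class TimeoutFetch {
    /**
     * Opens the feed URL with explicit timeouts. A timeout of 0 means
     * "wait forever", which is why both values are set here.
     */
    public static InputStream openWithTimeouts(String feedUrl) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(feedUrl).openConnection();
        conn.setConnectTimeout(10_000); // up to 10 s to establish the connection
        conn.setReadTimeout(60_000);    // up to 60 s of silence while reading the body
        // Throws java.net.SocketTimeoutException if either limit is exceeded.
        return conn.getInputStream();
    }
}
```

The returned stream can then be handed to the parser the same way as above: `r.parse(new InputSource(openWithTimeouts(url)))`. Raising the read timeout covers the slow-but-alive responses described in the question, while the connect timeout still fails fast if the server is unreachable.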