Search Results

Search found 16396 results on 656 pages for 'browser extensions'.

  • Anchor tags are blank

    - by ryanday
    I'm having a problem where my anchor tags sometimes don't display their link text. This happens in Mobile Safari on multiple iPhones and in the iPhone simulator. I'm using jQTouch r147, PhoneGap, and jQuery 1.4.2. I'm generating the data from a database call and adding anchor tags to a list like this:

        for (var i = 0; i < data.rows.length; i++) {
            var item = $('<li></li>');
            var name = data.rows.item(i).name;
            var anchor = $('<a href="#lpage">' + name + '</a>');
            item.addClass('arrow');
            // This line always displays the name, even when I can't see
            // the name in the browser
            debug.log('The name: ' + name);
            (function(info) {
                anchor.bind('tap', function(e) {
                    debug.log('Touch start ' + info.id);
                });
            })(data.rows.item(i));
            item.append(anchor);
            if (anchor.html() == null) {
                debug.log('html is blank');
            }
            $('#myUL').append(item);
        }

    Sometimes my list of names shows fine (http://imagebin.org/101462), and sometimes it is just blank (http://imagebin.org/101464). When the list is blank, debug.log() shows me 'html is blank', and the log also shows that the variable 'name' does, in fact, contain a valid name. When anchor.html() == null, I've also tried to .remove() the anchor tag and re-create it, but it always comes back without the name displayed. This happens on the mobile device and in the simulator, but I've never seen it happen in desktop Safari or Chrome. Has anyone seen something like this? I can't find the cause, and I can't get it to stop. Thank you for any ideas or suggestions!
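    A hedged workaround worth trying, in case the blank anchors come from Mobile Safari re-rendering while the list is appended item by item: build the list detached, set the label with .text() instead of string concatenation, and attach everything to the DOM once. This is only a sketch; data, debug, and the #myUL target come from the question, and nothing here is confirmed to fix the jQTouch issue.

        // Sketch: batch the DOM work and avoid HTML-parsing the names
        var fragment = $('<ul></ul>'); // detached; children appended once at the end
        for (var i = 0; i < data.rows.length; i++) {
            var row = data.rows.item(i);
            var anchor = $('<a href="#lpage"></a>').text(row.name); // .text() sets a text node directly
            (function(info) {
                anchor.bind('tap', function(e) {
                    debug.log('Touch start ' + info.id);
                });
            })(row);
            fragment.append($('<li class="arrow"></li>').append(anchor));
        }
        $('#myUL').append(fragment.children()); // single DOM insertion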

  • How to open multiple popup windows with jQuery on form submission (popups should be relative to the form)

    - by Ole Jak
    I have an HTML form like this:

        <form>
          <fieldset class="ui-widget-content ui-corner-all">
            <select id="Streams" class="multiselect ui-widget-content ui-corner-all" multiple="multiple" name="Streams[]">
              <option value="35">Example Name (35)</option>
              <option value="44">Example Name (44)</option>
              <option value="5698">Example Name (5698)</option>
              <option value="777">Example Name (777)</option>
              <option value="12">Example Name (12)</option>
            </select>
            <input type="submit" class="ui-state-default ui-corner-all" name="submitForm" id="submitForm" value="Play Stream from selected URL's!"/>
          </fieldset>
        </form>

    My page uses jQuery. As you can see, the user can select multiple items. On submit I want to open as many popup windows as there are selected items, each loading a URL like www.example.com/test.php?id=OPTION_SELECTED. By popup window I mean a real browser window, so for each selected option I get a popup with a different URL. Please help.
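    A minimal sketch of one way to do this: bind to the form's submit event and call window.open once per selected option. The www.example.com/test.php URL comes from the question; note that popup blockers may still collapse multiple window.open calls, since only the first is clearly tied to the user's click.

        $('form').submit(function(e) {
            e.preventDefault(); // keep the form from actually posting anywhere
            $('#Streams option:selected').each(function(i) {
                var url = 'http://www.example.com/test.php?id=' + $(this).val();
                // a unique window name per call forces a new window instead of reusing one
                window.open(url, 'stream' + i, 'width=500,height=400');
            });
        });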

  • ExpertPDF and Caching of URLs

    - by Josh
    We are using ExpertPDF to turn URLs into PDFs. Everything we do is in memory: we build up the request, read the stream into ExpertPDF, and then write the bits to file. All the files we have been requesting so far are plain HTML documents. Our designers update CSS files or change the HTML and re-request the documents as PDFs, but often things are getting cached. For example, if I rename the only CSS file and view the HTML page in a web browser, the page looks broken because the CSS doesn't exist. But if I request that page through the PDF generator, it still looks OK, which means the CSS is cached somewhere. Here's the relevant PDF creation code:

        // Create a request
        HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
        request.UserAgent = "IE 8.0";
        request.ContentType = "application/x-www-form-urlencoded";
        request.Method = "GET";

        // Send the request
        HttpWebResponse resp = (HttpWebResponse)request.GetResponse();
        if (resp.IsFromCache)
        {
            System.Web.HttpContext.Current.Trace.Write("FROM THE CACHE!!!");
        }
        else
        {
            System.Web.HttpContext.Current.Trace.Write("not from cache");
        }

        // Read the response
        pdf.SavePdfFromHtmlStream(resp.GetResponseStream(), System.Text.Encoding.UTF8, "Output.pdf");

    When I check the trace file, nothing is being loaded from cache. I checked the IIS log file and found a 200 response coming from the request even after a file had been updated (I would expect a 304 there). We've tried putting the No-Cache attribute on all HTML pages, but still no luck. I even turned off all caching at the IIS level. Is there anything in ExpertPDF that might be caching somewhere, or something I can do to the request object to force a hard refresh of all resources?

    UPDATE: I put ?foo at the end of my style href links and this updates the CSS every time. Is there a setting someplace that can prevent stylesheets from being cached so I don't have to use this inelegant solution?

  • AJAX HTML vs XML/JSON responses - performance or other reasons

    - by pedalpete
    I've got a fairly AJAX-heavy site, and pages of some 3 KB of formatted HTML are inserted into the DOM from AJAX requests. What I have been doing is taking the HTML responses and just inserting the whole thing using jQuery. My other option is to output XML (or possibly JSON) and then parse the document and insert it into the page. I've noticed that most larger sites do things the JSON/XML way; Google Mail, for example, returns XML rather than formatted HTML. Is this due to performance, or is there another reason to use XML/JSON rather than retrieving HTML? From a JavaScript standpoint, injecting HTML directly seems simplest. In jQuery I just do this:

        jQuery.ajax({
            type: "POST",
            url: "getpage.php",
            data: requestData,
            success: function(response) {
                jQuery('div#putItHear').html(response);
            }
        });

    With an XML/JSON response I would have to do this:

        jQuery.ajax({
            type: "POST",
            url: "getpage.php",
            data: requestData,
            success: function(xml) {
                $("message", xml).each(function(id) {
                    message = $("message", xml).get(id);
                    $("#messagewindow").prepend("" + $("author", message).text() +
                        ": " + $("text", message).text() + "");
                });
            }
        });

    That is clearly not as efficient from a code standpoint, and I can't expect it to be better for browser performance, so why do things the second way?
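    For comparison, a JSON response usually ends up tidier than the XML handler above. A sketch, assuming the server returns an array of {author, text} objects; those field names are hypothetical, not from the question:

        jQuery.ajax({
            type: "POST",
            url: "getpage.php",
            data: requestData,
            dataType: "json",
            success: function(messages) {
                jQuery.each(messages, function(i, msg) {
                    // .text() builds text nodes, so message content can't inject HTML
                    var line = jQuery('<p></p>').text(msg.author + ': ' + msg.text);
                    jQuery('#messagewindow').prepend(line);
                });
            }
        });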

  • Log in via cURL, then open that page logged in

    - by user207022
    I'm trying the following code to send POST data to the login form and then reload that page in the browser as a logged-in user. Somehow it isn't saving the cookie and reusing it for the header() call. Can the same thing as header() be done by calling curl again after sending the login details?

        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 60);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
        curl_setopt($ch, CURLOPT_USERAGENT, $defined_vars['HTTP_USER_AGENT']);
        //curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie);
        curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie);
        curl_setopt($ch, CURLOPT_MAXREDIRS, 1);

        // Send the login details as a POST
        curl_setopt($ch, CURLOPT_POST, 1);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
        $data = curl_exec($ch);

        setcookie($cookie);
        header('location: ' . $url);
        die();

  • How do you detect when the Flowplayer has started playing the video?

    - by Alex
    Hi. I'm using the Flowplayer Flash video player to play MP4 videos inside an AnythingSlider. I need to detect when the user has clicked the start button on the video, so I can stop the slideshow and let the user watch the video. I've tried using this code just to get an alert box, but it doesn't do anything (the code is at the end of the page, before the closing </body> tag):

        <script type="text/javascript">
            $f("player", "global/js/flowplayer/flowplayer-3.1.5.swf", {
                onStart: function(clip) {
                    alert('player started');
                }
            });
        </script>

    Flowplayer is set up as part of the html5media library, which automatically detects whether the browser can handle the <video> tag; if not, it replaces the anchor tag below with Flowplayer:

        <video id="vid" width="372" height="209" poster="global/vid/bbq-poster.jpg" controls preload>
            <source id="videoSource" src="global/vid/BBQ_Festival.mp4"
                    type='video/mp4; codecs="avc1.42E01E, mp4a.40.2"'
                    onerror="fallback(this.parentNode)"></source>
            <a href="global/vid/BBQ_Festival.mp4"
               style="display:block;width:372px;height:209px;padding-left:5px;"
               id="player"></a>
        </video>

    So the $f() call is in a JavaScript block at the end of the page, to give the player a chance to load correctly. What am I doing wrong? Thanks.
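    One thing worth checking, hedged because it depends on the Flowplayer 3.x configuration format: onStart is a clip-level event, so it may need to live in a clip block rather than at the top level of the configuration object. A sketch:

        <script type="text/javascript">
            $f("player", "global/js/flowplayer/flowplayer-3.1.5.swf", {
                clip: {
                    // clip-level events fire for each clip in Flowplayer 3.x
                    onStart: function(clip) {
                        alert('player started');
                    }
                }
            });
        </script>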

  • autologin component doesn't work on remote server

    - by user606521
    I am using the AutoLogin component from http://milesj.me/code/cakephp/auto-login (v3.5.1). It works on my localhost WAMP server but fails on the remote server. I am using these settings in the beforeFilter() callback:

        $this->AutoLogin->settings = array(
            // Model settings
            'model' => 'User',
            'username' => 'username',
            'password' => 'password',

            // Controller settings
            'plugin' => '',
            'controller' => 'users',

            // Cookie settings
            'cookieName' => 'rememberMe',
            'expires' => '+1 month',

            // Process logic
            'active' => true,
            'redirect' => true,
            'requirePrompt' => true
        );

    On the remote server it simply doesn't log users in automatically after the browser has been closed. I can't figure out what is causing the problem.

    EDIT: I have figured out what is causing the problem, but not how to fix it. First of all, the cookie is set like this:

        $this->Cookie->write('key', array('username' => 'someusername', 'hash' => 'somehash', ...));

    Then it's read like this:

        $cookie = $this->Cookie->read('key');

    On my WAMP server $cookie is:

        array('username' => 'someusername', 'hash' => 'somehash', ...)

    On the remote server the returned $cookie is:

        string(159) "{\"username\":\"YWxlay5iYXJzsdsdZXdza2ldssd21haWwuY29t\",\"password\":\"YWxlazc3ODEy\",\"hash\":\"aa15bffff9ca12cdcgfgb351d8bfg2f370bf458\",\"time\":1339923926}"

    when it should be:

        array('username' => "YWxlay5iYXJzsdsdZXdza2ldssd21haWwuY29t", 'password' => "YWxlazc3ODEy", ...)

    Why is the returned cookie a string and not an array?

  • How can I use JSONP to download client-side javascript objects?

    - by Alex Mcp
    I'm trying to get client-side JavaScript objects saved as a file locally. I'm not sure if this is possible. The basic architecture is this:

    1. Ping an external API to get back a JSON object.
    2. Work client-side with that object, and eventually have a "download me" link.
    3. This link sends the data to my server, which processes it and sends it back with MIME type application/json, which (should) prompt the user to download the file locally.

    Right now here are my pieces.

    Server-side code:

        <?php
        $data = array('zero', 'one', 'two', 'testing the encoding');
        $json = json_encode($data);
        //$json = json_encode($_GET['']); //eventually I'll encode their data, but I'm testing
        header("Content-type: application/json");
        header('Content-Disposition: attachment; filename="backup.json"');
        echo $_GET['callback'] . ' (' . $json . ');';
        ?>

    Relevant client-side code:

        $("#download").click(function() {
            var json = JSON.stringify(collection); // serializes their object
            $.ajax({
                type: "GET",
                url: "http://www.myURL.com/api.php?callback=?", // this is the above script
                dataType: "jsonp",
                contentType: 'jsonp',
                data: json,
                success: function(data) {
                    console.log("Data Received: " + data[3]);
                }
            });
            return false;
        });

    Right now, when I visit the api.php page with Firefox, it prompts a download of backup.json, which results in this text file, as expected:

        (["zero","one","two","testing the encoding"]);

    And when I click #download to run the AJAX call, Firebug logs "Data Received: testing the encoding", which is almost what I'd expect: I'm receiving the JSON string and deserializing it, which is great. I'm missing two things. The actual questions:

    1. What do I need to do to get the same prompt-to-download behavior that I get when I visit the page in a browser?
    2. How do I access, server-side, the JSON object being sent to the server, so I can serialize it? I don't know what index it is in the $_GET array (silly, I know, but I've tried almost everything).
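    On the first question: an AJAX/JSONP response can never trigger the browser's download prompt, because the data lands in script context rather than in a navigation. A common workaround is to submit the JSON through a plain form post and let the Content-Disposition header do the rest. A sketch, assuming api.php is changed to read the payload from a POST field; the field name "data" is hypothetical:

        $("#download").click(function() {
            var json = JSON.stringify(collection);
            // a real form submission is a navigation, so the server's
            // Content-Disposition: attachment header triggers the save dialog
            $('<form method="POST" action="http://www.myURL.com/api.php"></form>')
                .append($('<input type="hidden" name="data">').val(json))
                .appendTo('body')
                .submit();
            return false;
        });

    That would also answer the second question in passing: server-side, the payload would then be in $_POST['data'] rather than buried in $_GET.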

  • AjaxFileUpload returns download panel when data is JSON

    - by Tr.Crab
    I use AjaxFileUpload (http://www.phpletter.com/Our-Projects/AjaxFileUpload/) to upload a file and get a JSON response in Struts2 (code.google.struts2jsonresult.JSONResult), but the browser always pops up the download panel. Please give me some suggestions; thanks in advance. Here is my config in struts.xml:

        ......
        <result-type name="json" class="code.google.struts2jsonresult.JSONResult">
        ............
        <action name="doGetList" method="doGetList" class="main.java.GetListAction">
            <result type="json">
                <param name="target">jsonObject</param>
                <param name="deepSerialize">true</param>
                <param name="patterns">-*.class</param>
            </result>
        </action>

    And the JS client:

        function ajaxFileUpload() {
            $("#loading").ajaxStart(function() {
                $(this).show();
            }).ajaxComplete(function() {
                $(this).hide();
            });

            $.ajaxFileUpload({
                url: 'doGetList.do',
                secureuri: false,
                fileElementId: 'uploadfile',
                dataType: 'json',
                success: function(data, status) {
                    if (typeof(data.error) != 'undefined') {
                        if (data.error != '') {
                            alert(data.error);
                        } else {
                            alert(data.msg);
                        }
                    }
                },
                error: function(data, status, e) {
                    alert(e);
                    alert(data.records);
                }
            });
            return false;
        }

  • Ruby on Rails Mongrel web server stuck when MySQL service is running

    - by Marcos Buarque
    Hi, I am a Ruby on Rails newbie and already have a problem. I have started the Mongrel web server, and it works fine when the MySQL service isn't running. But when MySQL is on, Mongrel gets stuck and stops serving pages. So far I have tested the localhost:3000 URL. When MySQL is off, it serves the page; when I click "About your application's environment", I get the message (of course) "Can't connect to MySQL server on 'localhost' (10061)". After starting the MySQL service and refreshing, I get no more answers and Mongrel does not serve the web page. It gets stuck with no response to the browser, and then I have to stop the web server and restart it. I have installed the mysql2 gem with the command gem install mysql2. I was able to create the _test and _development databases with the command rake db:create. I have tested with the MySQL root user and a blank password, and also with a superuser I created. No success. Here is the server log:

        Started GET "/rails/info/properties" for 127.0.0.1 at Fri Dec 24 17:41:25 -0200 2010

        Mysql2::Error (Can't connect to MySQL server on 'localhost' (10061)):

        Rendered C:/Ruby187/lib/ruby/gems/1.8/gems/actionpack-3.0.3/lib/action_dispatch/middleware/templates/rescues/_trace.erb (1.0ms)
        Rendered C:/Ruby187/lib/ruby/gems/1.8/gems/actionpack-3.0.3/lib/action_dispatch/middleware/templates/rescues/_request_and_response.erb (5.0ms)
        Rendered C:/Ruby187/lib/ruby/gems/1.8/gems/actionpack-3.0.3/lib/action_dispatch/middleware/templates/rescues/diagnostics.erb within rescues/layout (35.0ms)

    I am running on a Windows 7 environment with the firewall down.

  • [grails] setting cookies when render type is "contentType: text/json"

    - by Robin Jamieson
    Is it possible to set cookies on the response when the return render type is set to JSON? I can set cookies on the response object when returning with a standard render type, and I'm able to get them back on the subsequent request. However, if I set the cookies while rendering the return values as JSON, I can't seem to get the cookie back on the next request object. What's happening here?

    These two actions work as expected, with 'basicForm' performing a regular form post to the action 'withRegularSubmit' when the user clicks submit:

        // first action sets the cookie and second action yields the originally set cookie
        def regularAction = {
            // using cookie plugin
            response.setCookie("username-regular", "regularCookieUser123", 604800)
            return render(view: "basicForm")
        }

        // called by form post
        def withRegularSubmit = {
            def myCookie = request.getCookie("username-regular")  // returns the value 'regularCookieUser123'
            return render(view: "resultView")
        }

    When I switch to setting the cookie just before returning the response as JSON, I don't get the cookie back with the post. The request starts by getting an HTML document that contains a form, and when the document load event fires, the following request is invoked via JavaScript with jQuery like this:

        var someUrl = "http://localhost/jsonAction";
        $.get(someUrl, function(jsonData) { /* do some work with javascript */ });

    The controller work:

        // this action is called initially and returns an html doc with a form
        def loadJsonForm = {
            return render(view: "jsonForm")
        }

        // called via javascript when the document load event is fired
        def jsonAction = {
            response.setCookie("username-json", "jsonCookieUser456", 604800)  // using cookie plugin
            return render(contentType: 'text/json') {
                'pair'('myKey': "someValue")
            }
        }

        // called by form post
        def withJsonSubmit = {
            def myCookie = request.getCookie("username-json")  // got null value, expecting: jsonCookieUser456
            return render(view: "resultView")
        }

    The data is returned to the server as a result of the user pressing the submit button, not through a script. Before the submit of both 'withRegularSubmit' and 'withJsonSubmit', I see the cookies stored in the browser (Firefox), so I know they reached the client.

  • Response.Redirect not firing due to code to prevent re-submission

    - by Marco
    I have an event which needs to contact some third-party providers before performing a redirect (think "final payment page on an e-commerce site") and hence has some lag associated with its processing. It is very important that these third-party providers are not contacted more than once, and impatient users sometimes try to refresh the page (re-submitting the data). The general code structure is:

        If Session("orderStatus") <> "processing" Then
            Session("orderStatus") = "processing"
            DoThirdPartyStuffThatTakesSomeTime()
            Response.Redirect("confirmationPage.asp", True)
        End If

    The problem is that if the user refreshes the page, the Response.Redirect does not happen (even though the rest of the code will run before the redirect from the original submission). The new submission seems to create a new thread for the browser which takes precedence: it skips this bit of code (as intended, to keep the third-party providers from being contacted a second time), and since there is no redirect, it just comes back to the same page. The whole second submission may complete before the first submission has finished its job. Any help on how I can still ignore all subsequent submissions of the page but make the redirect work? Thanks.
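    A client-side guard can at least cut down duplicate submissions while the server-side behavior is sorted out. A sketch in jQuery, assuming the page can include it; this complements, and does not replace, the server-side check:

        $('form').submit(function() {
            var form = $(this);
            if (form.data('submitted')) {
                return false; // swallow refresh-happy double submits
            }
            form.data('submitted', true);
            form.find(':submit').attr('disabled', 'disabled'); // grey out the button too
            return true;
        });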

  • reload parent from within iframe

    - by Lauren
    I can't seem to reload the parent page from within an iframe. I've looked at similar questions' answers, but nothing has worked so far. The iframe I'm working with is at http://www.avaline.com/R3000_3 : log in, hit the "order sample" button, and then hit "here" where it says "Your Third Party Shipper Numbers (To enter one, click here.)". I tried the JavaScript statements window.top.location.reload(), window.parent.location.reload(), and window.parent.location.href=window.parent.location.href, but none of them worked in FF 3.6, so I didn't move on to other browsers, although I am aiming for a cross-browser solution. I put the one-line statements inside setTimeout("statement", 2000) so people could read the content of the iframe ("You have updated your shipper number(s). The page should refresh automatically. If not, please refresh and return to 'order sample'.") before the redirect happens, but that shouldn't affect the execution of the statements. I wish I could test and debug the statements with the Firebug console from within the iframe, but there doesn't seem to be any great way to do that.
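    A sketch of the usual pattern, hedged because same-origin rules must hold between the iframe and its parent for any of this to work. Passing a function to setTimeout instead of a string also removes any ambiguity about which window evaluates the code:

        setTimeout(function() {
            try {
                window.top.location.reload(true); // true requests a server round-trip, not a cache hit
            } catch (e) {
                // a cross-origin parent throws a security error here
                alert('Could not reload the parent page: ' + e.message);
            }
        }, 2000);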

  • Rackspace Cloud rewrite jpg causes Session reset

    - by willoller
    This may be the .NET version of this question. I have an image script with the following:

        ...
        Response.WriteFile(filename);
        Response.End();

    I am rewriting .jpg files using the following rewrite rule in web.config:

        <rule name="Image Redirect" stopProcessing="true">
          <match url="^product-images/(.*).jpg" />
          <conditions>
            <add input="{REQUEST_URI}" pattern="\.(jp?g|JP?G)$" />
          </conditions>
          <action type="Redirect" redirectType="SeeOther"
                  url="/product-images/ProductImage.aspx?path=product-images/{tolower:{R:1}}.jpg" />
        </rule>

    It basically just rewrites the image path into a query parameter. The problem is that (intermittently, of course) Mosso returns a new ASP.NET session cookie, which breaks the whole world. Directly accessing a static .jpg file does not cause this problem, and neither does directly accessing the image script. Only rewriting a .jpg request to the .aspx script causes the session loss.

    Things I have tried (from the Rackspace doc "How can I bypass the cache?"): I added Private cacheability to the image script itself:

        Response.Cache.SetCacheability(HttpCacheability.Private);

    I tried adding these cache-disabling nodes to web.config:

        <staticContent>
          <clientCache cacheControlMode="DisableCache" />
        </staticContent>

    and

        <httpProtocol>
          <customHeaders>
            <add name="Cache-Control private" value="Cache-Control private" />
          </customHeaders>
        </httpProtocol>

    The solution I need: the browser cache cannot be disabled, which means potential solutions involving Cache.SetNoStore() or HttpCacheability.NoCache will not work.

  • How to tell what account my webservice is running under in Visual Studio 2005

    - by John Galt
    I'm going a little nuts trying to understand the documentation on impersonation and delegation, and the question has come up of what account my web service is running under. I am logged in as myDomainName\johna on my development workstation, called JOHNXP. From VStudio 2005 I start my web service via Debug, and the WSDL page comes up in my browser. From Task Manager, I see the following while sitting at a breakpoint in my .asmx code:

        aspnet_wp.exe  pid=1316  UserName=ASPNET
        devenv.exe     pid=3304  UserName=johna

    The IIS Directory Security tab for the virtual directory that hosts my ws.asmx code has "Enable Anonymous access" UNCHECKED and "Integrated Windows Authentication" CHECKED. So when the MSDN people say "you must configure the user account under which the server process runs", what would they be referring to in the case of my little web service described above? I am quoting from http://msdn.microsoft.com/en-us/library/aa302400.aspx.

    Ultimately, I want this web service to impersonate whatever authenticated domain user browses to and invokes it. My web service in turn consumes another ASMX web service on a different server (but the same domain). I need that remote web service to use the impersonated domain user's credentials (not those of my web service on JOHNXP). So this is getting a little snarly for me to understand, and I am unclear about the account my web service uses. I think it is ASPNET in IIS 5.1 on WinXP, but I'm not sure.

  • Events still not showing up despite the many tries...sigh

    - by sheng88
    I have tried the recommendations from this forum and from many Google searches, but I still can't get the events to show up on my calendar via JSP. I tried with PHP and it worked, so I wonder where the error is. The processRequest method is fine, but when it dispatches to the JSP page, nothing appears in the browser:

        protected void processRequest(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            String email = request.getParameter("email");
            try {
                ArrayList<CalendarEvt> calc = CalendarDAO.retrieveEvent(email);
                for (CalendarEvt calendarEvt : calc) {
                    System.out.println(calendarEvt.getEventId());
                }
                request.setAttribute("calendar", calc);
                request.getRequestDispatcher("calendar.jsp").forward(request, response);
            } catch (Exception e) {
            }
        }

    Here is the JSP section that's giving me headaches (without the loop, the Google link does appear). I have tried putting quotations around the values and leaving them out; still no luck:

        <%-- Load user's calendar --%>
        <script type='text/javascript'>
            $(document).ready(function() {
                var date = new Date();
                var d = date.getDate();
                var m = date.getMonth();
                var y = date.getFullYear();
                $('#calendar').fullCalendar({
                    editable: false,
                    events: [
                        <c:forEach items="calendar" var="calc">
                        {
                            title: '${calc.eventName}',
                            start: ${calc.eventStart}
                        },
                        </c:forEach>
                        {
                            title: 'Click for Google',
                            start: new Date(y, m, 1),
                            end: new Date(y, m, 1),
                            url: 'http://google.com/'
                        }
                    ] // end of events
                });
            });
        </script>

    Any kind of help would be greatly appreciated. Thanks!
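    One detail that stands out, offered as a guess: in JSTL, items="calendar" passes the literal string "calendar" rather than the request attribute of that name, so the loop never iterates over the events; the attribute needs an EL expression. The start value also likely needs quoting unless it is a numeric timestamp. A sketch of the corrected fragment:

        <script type='text/javascript'>
            $(document).ready(function() {
                $('#calendar').fullCalendar({
                    editable: false,
                    events: [
                        <c:forEach items="${calendar}" var="calc">
                        {
                            title: '${calc.eventName}',
                            start: '${calc.eventStart}' // quote unless this is a numeric timestamp
                        },
                        </c:forEach>
                        {
                            title: 'Click for Google',
                            start: new Date(),
                            url: 'http://google.com/'
                        }
                    ]
                });
            });
        </script>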

  • .NET consumer of ActiveX throwing TargetParameterCountException

    - by DevSolo
    I have a .NET (3.5 with Visual Studio 2008) app that hosts a visual ActiveX control (written in C++ with Visual Studio 2003). I have access to all the sources, but can't easily move the ActiveX control up to 2008. This has worked fine in the past. I made some changes to the ActiveX control, and now, when calling one method on it, I'm getting a TargetParameterCountException 100% of the time. The signature of the ActiveX method is:

        LONG CMyActive::License(LPCTSTR string1, LPCTSTR string2, LONG long1, LPCTSTR string3, LPCTSTR string4);

    When viewing the method in the Object Browser or Reflector, .NET sees it as:

        public virtual int License(string string1, string string2, int long1, string string3, string string4)

    I renamed the parameters for demonstration purposes (the boss gets twitchy about any code) but left the method name, as it could be relevant. There are method calls prior to this one that work. I just can't seem to figure out why I'm suddenly getting this exception. The HRESULT is 0x8002000e, and a quick search seems to indicate that's a general "invalid number of parameters" error. Thanks to all for reading.

  • How do I read a public twitter feed using .Net

    - by Jeff Weber
    I'm trying to read the public Twitter status of a user so I can display it in my Windows Phone application. I'm using Scott Gu's example: http://weblogs.asp.net/scottgu/archive/2010/03/18/building-a-windows-phone-7-twitter-application-using-silverlight.aspx. When my code comes back from the async call, I get a System.Security.SecurityException as soon as I try to use e.Result. I know my URI is correct because I can drop it in the browser and get good results. Here is my relevant code:

        public void LoadNewsLine()
        {
            WebClient twitter = new WebClient();
            twitter.DownloadStringCompleted += new DownloadStringCompletedEventHandler(twitter_DownloadStringCompleted);
            twitter.DownloadStringAsync(new Uri("http://api.twitter.com/1/statuses/user_timeline.xml?screen_name=krashlander"));
        }

        void twitter_DownloadStringCompleted(object sender, DownloadStringCompletedEventArgs e)
        {
            XElement xmlTweets = XElement.Parse(e.Result); // exception thrown here!

            var message = from tweet in xmlTweets.Descendants("status")
                          select tweet.Element("text").Value;

            //Set message and tell UI to update.
            //NewsLine = message.ToString();
            //RaisePropertyChanged("NewsLine");
        }

    Any ideas, anyone?

  • Custom Silverlight Splash Screen causes White Screen of Death

    - by wickster05
    I'm developing a Silverlight 4 application and have created a custom splash screen. At first glance, the custom splash screen worked well - really well. After a few days I started noticing that the splash screen would no longer show and the screen would remain blank. This seems to occur only when I open multiple IE tabs/windows all pointing to the same application. The first few load fine, while the following tabs/windows remain "white", as if nothing has loaded. This doesn't appear to be an issue with the other browsers I've tested (Firefox and Chrome). Unfortunately, this product requires multiple screens to be open (and I'm not going to require our users to use a non-Microsoft internet browser). Furthermore, we have another product that hosts this Silverlight application inside a WPF WebBrowser control (which is similar to IE and exhibits the same issues described above). Does anyone have any ideas on how to get around this? It is becoming increasingly frustrating. I should also point out that the default splash screen seems to avoid these issues: when I remove the custom splash screen, we no longer see these problems. ANY help would be appreciated greatly! -Tom

  • What programming language is this?

    - by Richard M.
    I recently stumbled over a very odd source listing on a rather old programming-related site (I lost it somewhere in my browser history, as I didn't care about it at first). I think it is part of a simple (console-based?) snake game. I searched and searched but didn't find a language that looked anything like it. It seems like a mix of Python, Ruby, and C++. What the hell? What programming language is the listing below written in? Maybe you can figure it out?

        my Snake.hasProps {
            length
            parts
            xDir
            yDir
        } & hasMethods {
            init:
                length = 0
                parts[0].x,y = 5
            move:
                parts[ 0 ].x,y.!add xDir | yDir  # Move the head
                map parts(i,v): parts[ i ] = parts[ i + 1 ]
                checkBiteSelf
                checkFeed
            checkBiteSelf:
                part
        }

        my SnakePart.hasProps {
            x
            y
        }

        fork SnakePart to !Feed

        my Game.hasProps {
            frameTime = 30
        } & hasMethods {
            init:
                mainloop
            mainloop:
                sys.util.sleep frameTime
                Snake.move
                Field.getInput -> Snake.xDir | Snake.yDir
                Field.reDraw with Snake & Feed & Game  # For FPS
        }

        main.isMethod {
            game.init
        }

  • Facebook offline access step-by-step

    - by Ben
    After literally a full day of searching Facebook and Google for an up-to-date, working way to do something seemingly simple: I am looking for a step-by-step explanation of how to get offline_access for a user of a Facebook app, and then how to use it (the session key) to retrieve friends and profile data offline, outside a browser. Preferably with the Facebook Java API. Thanks. And yes, I did check the Facebook wiki.

    Update: Anyone? This:

        http://www.facebook.com/authorize.php?api_key=<api-key>&v=1.0&ext_perm=offline_access

    gives me offline_access, but how do I retrieve the session_key? Why can't Facebook just write simple documentation? I mean, there are like 600 people working there. The seemingly same question, http://stackoverflow.com/questions/617043/getting-offlineaccess-to-work-with-facebook, does not answer how to retrieve the session key.

    Edit: I am still stuck on this. I guess nobody has really tried this kind of batch access yet...

  • Server returns 500 error only when called by Java client using urlConnection/httpUrlConnection

    - by user455889
    Hi - I'm having a very strange problem. I'm trying to call a servlet (JSP) with an HTTP GET and a few parameters (http://mydomain.com/method?param1=test&param2=123). If I call it from the browser, or via wget in a bash session, it works fine. However, when I make the exact same call from a Java client using URLConnection or HttpURLConnection, the server returns a 500 error. I've tried everything I have found online, including:

        urlConn.setRequestProperty("Accept-Language", "en-us,en;q=0.5");

    Nothing I've tried has worked, though. Unfortunately, I don't have access to the server I'm calling, so I can't see its logs. Here's the latest code:

        private String testURLConnection() {
            String ret = "";
            String url = "http://localhost:8080/TestService/test";
            String query = "param1=value1&param2=value2";
            try {
                URLConnection connection = new URL(url + "?" + query).openConnection();
                connection.setRequestProperty("Accept-Charset", "UTF-8");
                connection.setRequestProperty("Accept-Language", "en-us,en;q=0.5");

                BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(connection.getInputStream()));
                String line;
                StringBuilder content = new StringBuilder();
                while ((line = bufferedReader.readLine()) != null) {
                    content.append(line + "\n");
                }
                bufferedReader.close();

                ret = content.toString();
                log.debug(methodName + " return = " + ret);
            } catch (Exception ex) {
                log.error("Exception: " + ex);
                log.error("stack trace: " + getStackTrace(ex));
            }
            return ret;
        }

    Any help would be greatly appreciated!

  • Greasemonkey script not executed when unusual content loading is being used

    - by Sam Brightman
    I'm trying to write a Greasemonkey script for Facebook, and I'm having some trouble with the funky page/content loading they do (I don't quite understand it - a lot of the links actually just change the GET parameters, but I think they do some kind of server redirect to make the URL look the same to the browser too?). Essentially, the only test required is putting a GM_log() on its own in the script: if you click around Facebook, even with facebook.com/* as the pattern, it is often not executed. Is there anything I can do, or is the idea of a "page load" fixed in Greasemonkey, with FB "tricking" it into not running by using a single URL? If I try some basic content manipulation like this:

        GM_log("starting");

        var GM_FB = new Object;
        GM_FB.birthdays = document.evaluate("//div[@class='UIUpcoming_Item']", document, null, XPathResult.UNORDERED_NODE_SNAPSHOT_TYPE, null);
        for (i = GM_FB.birthdays.snapshotLength - 1; i >= 0; i--) {
            if (GM_FB.birthdayRegex.test(GM_FB.birthdays.snapshotItem(i).innerHTML)) {
                GM_FB.birthdays.snapshotItem(i).setAttribute('style', 'font-weight: bold; background: #fffe88');
            }
        }

    the result is that sometimes only a manual page refresh will make it work. Pulling up the Firebug console and forcing the code to run works fine. Note that this isn't due to late loading of certain parts of the DOM: I have added code that waits for the relevant elements and, crucially, the message never gets logged for certain transitions - for example, when I switch from Messages to News Feed and back.
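    Greasemonkey only fires on real page loads, so script-driven navigation like Facebook's never re-triggers the script. A common hedged workaround is to poll for navigation yourself and re-run the manipulation; a sketch, where highlight() stands in for the birthday code above:

        function highlight() {
            // ... the XPath/birthday logic from the question goes here ...
        }

        var lastHref = location.href;
        highlight();
        setInterval(function() {
            if (location.href !== lastHref) { // script-driven navigation happened
                lastHref = location.href;
                highlight();
            }
        }, 500);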

  • PHP Socket Server vs node.js: Web Chat

    - by Eliasdx
    I want to build an HTTP web chat using long-held HTTP requests (Comet), AJAX, and WebSockets (depending on the browser used). The user database is in MySQL. The chat is written in PHP, except maybe the chat stream itself, which could also be written in JavaScript (node.js). I don't want to start one PHP process per user, as there is no good way to send the chat messages between those PHP children. So I thought about writing my own socket server, in either PHP or node.js, which should be able to handle more than 1000 connections (chat users). As a pure web developer (PHP), I'm not very familiar with sockets, since I usually let the web server take care of connections. The chat messages won't be saved on disk or in MySQL but in RAM, as an array or object, for best speed.

    As far as I know, there is no way to handle multiple connections at the same time in a single PHP process (socket server); however, you can accept a large number of socket connections and process them successively in a loop (read and write; for each incoming message, write to all socket connections). The problem is that there will most likely be lag with ~1000 users, and MySQL operations could slow the whole thing down, which would then affect all users.

    My question is: can node.js handle such a socket server with better performance? Node.js is event-based, but I'm not sure whether it can process multiple events at the same time (wouldn't that need multi-threading?) or whether there is just an event queue. With an event queue it would be just like PHP: process user after user.

    I could also spawn one PHP process per chat room (far fewer users), but as far as I know there are single-threaded IRC servers (written in C++ or whatever) that are capable of handling thousands of users, so maybe it's also possible in PHP. I would prefer PHP over node.js, because then the project would be PHP-only and not a mixture of programming languages. However, if Node can process connections simultaneously, I'd probably choose it.
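    For a feel of the node.js model discussed above, here is a minimal sketch of a broadcast chat server on raw sockets: one process, no threads, with the event loop interleaving all connections as their callbacks fire. This is an illustration of the event-queue model, not a production chat server.

        var net = require('net');
        var clients = [];

        var server = net.createServer(function(socket) {
            clients.push(socket);

            socket.on('data', function(chunk) {
                // broadcast each incoming message to every other connected client
                clients.forEach(function(other) {
                    if (other !== socket) {
                        other.write(chunk);
                    }
                });
            });

            socket.on('close', function() {
                clients.splice(clients.indexOf(socket), 1);
            });
        });

        server.listen(8000); // one single-threaded process, many concurrent sockets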

  • VS2010 renders controls' JS awkwardly

    - by Juergen Hoffmann
    I have created a Website Project in VS2010. My controls are not rendered correctly: the JS that is produced is not correctly formatted. Here is an example:

        protected void Page_PreRender(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                objListBox.Attributes.Add("onchange", "Control_doPostBack('" + objListBox.ClientID + "','ListBox_OnClick'); return false;");
                objListBox.Attributes.Add("onblur", "Control_doPostBack('" + trListbox.ClientID + "','ListBox_OnBlur'); return false;");
                img.Attributes.Add("onclick", "Control_doPostBack('" + trListbox.ClientID + "','IMG_OnClick'); return false;");
            }
        }

    And the corresponding control is rendered as:

        <select size="4" name="ctl00$PlaceHolder_Content$drop$objListBox" onchange="Control_doPostBack(&#39;PlaceHolder_Content_drop_objListBox&#39;,&#39;ListBox_OnClick&#39;); return false;setTimeout(&#39;__doPostBack(\&#39;ctl00$PlaceHolder_Content$drop$objListBox\&#39;,\&#39;\&#39;)&#39;, 0)" id="PlaceHolder_Content_drop_objListBox" onblur="Control_doPostBack(&#39;PlaceHolder_Content_drop_trListbox&#39;,&#39;ListBox_OnBlur&#39;); return false;" style="position:absolute;">
        </select>

    As you can see, the single quotes are rendered as &#39;, which screws up the browser. Is there a tweak for MSBuild or in the project properties? Any help is highly appreciated.
