Search Results

Search found 24162 results on 967 pages for 'jquery live'.


  • jQuery loaded? Colorbox available? Are js files called via $.getScript downloaded again on every request?

    - by uzay95
    I dynamically load js files with _tumIsler.js (_allStuff.js):

        <script src="../js/_tumJsler.js" type="text/javascript"></script>

    It contains:

        // url => http: + // + localhost:4399 + /yabant/
        //        protocol + "//" + host + '/virtualDirectory/'
        var baseUrl = document.location.protocol + "//" + document.location.host + '/yabant/';

        // If there is "~/" at the beginning of the url, replace it with baseUrl
        function ResolveUrl(url) {
            if (url.indexOf("~/") == 0) { url = baseUrl + url.substring(2); }
            return url;
        }

        // Attaching scripts to any tag
        function addJavascript(jsname, pos) {
            var th = document.getElementsByTagName(pos)[0];
            var s = document.createElement('script');
            s.setAttribute('type', 'text/javascript');
            s.setAttribute('src', jsname);
            th.appendChild(s);
        }

        // I want to make sure jQuery is loaded
        addJavascript(ResolveUrl('~/js/1_jquery-1.4.2.min.js'), 'head');

        var loaded = false; // assume it didn't load; change it to true if it did

        function fControl() {
            // alert("Is jQuery loaded?");
            if (typeof jQuery == 'undefined') {
                loaded = false;
                fTry2LoadJquery();
            } else {
                loaded = true;
                fGetOtherScripts();
            }
        }

        // Check whether jQuery is loaded
        fControl();

        function fTry2LoadJquery() {
            // alert("jQuery didn't load! Trying to reload...");
            if (loaded == false) { setTimeout("fControl()", 1000); } else { return; }
        }

        function getJavascript(jsname, pos) {
            // I want to retrieve every script one by one
            $.ajaxSetup({
                async: false,
                beforeSend: function() { $.ajaxSetup({ async: false, cache: true }); },
                complete: function() { $.ajaxSetup({ async: false, cache: true }); },
                success: function() { }
            });
            $.getScript(ResolveUrl(jsname), function() { /* ok! */ });
        }

        function fGetOtherScripts() {
            // alert("Other js files will be loaded in this function");
            getJavascript(ResolveUrl('~/js/5_json_parse.js'), 'head');
            getJavascript(ResolveUrl('~/js/3_jquery.colorbox-min.js'), 'head');
            getJavascript(ResolveUrl('~/js/4_AjaxErrorHandling.js'), 'head');
            getJavascript(ResolveUrl('~/js/6_jsSiniflar.js'), 'head');
            getJavascript(ResolveUrl('~/js/yabanYeni.js'), 'head');
            getJavascript(ResolveUrl('~/js/7_ResimBul.js'), 'head');
            getJavascript(ResolveUrl('~/js/8_HaberEkle.js'), 'head');
            getJavascript(ResolveUrl('~/js/9_etiketIslemleri.js'), 'head');
            getJavascript(ResolveUrl('~/js/bugun.js'), 'head');
            getJavascript(ResolveUrl('~/js/yaban.js'), 'head');
            getJavascript(ResolveUrl('~/embed/bitgravity/functions.js'), 'head');
        }

    After all these js files are loaded, the following runs to show the UploadFile page inside the page when the button with id "btnResimYukle" is clicked:

        <script type="text/javascript">
            if (jQuery().colorbox) {
                alert("colorbox exists");
            } else {
                alert("colorbox doesn't exist");
                $.ajaxSetup({ cache: true, async: false });
                $.getScript(ResolveUrl('~/js/3_jquery.colorbox-min.js'), function() {
                    alert("Loaded!");
                    $('#btnResimYukle').live('click', function() {
                        $.fn.colorbox({ iframe: true, width: 700, height: 600, href: ResolveUrl('~/Yonetim/DosyaYukle.aspx') });
                        return false;
                    });
                });
            }
        </script>

    First, I am always calling js files with the $.getScript function. Are they downloaded again on every $.getScript request? If so, how can I prevent that? Does this work:

        $.ajaxSetup({ cache: true, async: false });

    Second, I always get an error when I press F5 or Ctrl+F5, but when I press Enter on the url there is no error.
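
    In jQuery, requests with dataType "script" are treated as non-cacheable by default, so each $.getScript call appends a cache-busting timestamp and hits the server again. A minimal sketch of one common workaround is to call $.ajax directly with caching enabled; the cachedGetScript helper name below is made up for illustration:

        // Cache-friendly replacement for $.getScript (hypothetical helper name)
        function cachedGetScript(url, callback) {
            $.ajax({
                url: url,
                dataType: "script",
                cache: true,          // keep the browser cache instead of appending "_=timestamp"
                success: callback
            });
        }

        // Usage: load colorbox once; later calls can be served from the browser cache
        cachedGetScript(ResolveUrl('~/js/3_jquery.colorbox-min.js'), function () {
            // plugin is now available
        });

    The success option is used rather than a Deferred chain so the sketch also works on older jQuery versions such as the 1.4.x referenced in the question.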

    Read the article

  • jQuery $.getJSON only works once in Internet Explorer - help, please!

    - by JasperS
    I have a PHP function which inserts a search bar into each page on a website. The site checks whether the user has javascript enabled and, if so, inserts some jQuery ajax code to link select boxes (instead of using the fallback onchange="form.submit()"). $.getJSON works perfectly for me in other browsers, but in IE, if I do a full page refresh (Ctrl+F5), my ajax works flawlessly until I navigate to a new page (or the same page with $PHP_SELF), either by submitting the form or by clicking a link. The jQuery onchange function fires, but then jQuery throws an error:

        Webpage error details
        Message: Object doesn't support this property or method
        Line: 123
        Char: 183
        Code: 0
        URI: http://~#UNABLE~TO~DISCLOSE#~/jquery-1.4.2.min.js

    It seems like the jQuery function $.getJSON() is gone. This looks like some kind of caching issue, as it happens on the second page load, but I think I have all the caching prevention in place. Here's a snippet of the code that adds the jQuery functions:

        if (isset($_SESSION['NO_SCRIPT']) == true && $_SESSION['NO_SCRIPT'] == false) {
            $html .= '<script type="text/javascript" charset="utf-8">';
            $html .= '$.ajaxSetup({ cache: false });';
            $html .= '$.ajaxSetup({"error":function(XMLHttpRequest,textStatus, errorThrown) { alert(textStatus); alert(errorThrown); alert(XMLHttpRequest.responseText); }});';
            $html .= '</script>';
            $html .= '<script type="text/javascript" charset="utf-8">';
            $html .= '$(function(){ $("select#searchtype").change(function() { ';
            $html .= 'alert("change fired!"); ';
            $html .= '$.getJSON("ajaxgetcategories.php", {id: $(this).val()}, function(j) { ';
            $html .= 'alert("ajax returned!"); ';
            $html .= 'var options = \'\'; ';
            $html .= 'options += \'<option value="0" >--\' + j[0].all + \'--</option>\'; ';
            $html .= 'for (var i = 0; i < j.length; i++) { options += \'<option value="\' + j[i].id + \'">\' + j[i].name + \'</option>\'; } ';
            $html .= '$("select#searchcategory").html(options); }) }) }) ';
            $html .= '</script> ';
            $html .= '<script type="text/javascript" charset="utf-8"> ';
            $html .= '$(function(){ $("select#searchregion").change(function() { ';
            $html .= 'alert("change fired!"); ';
            $html .= '$.getJSON("ajaxgetcountries.php", {id: $(this).val()}, function(j) { ';
            $html .= 'alert("ajax returned!"); ';
            $html .= 'var options = \'\'; ';
            $html .= 'options += \'<option value="0" >--\' + j[0].all + \'--</option>\'; ';
            $html .= 'for (var i = 0; i < j.length; i++) { options += \'<option value="\' + j[i].id + \'">\' + j[i].name + \'</option>\'; } ';
            $html .= '$("select#searchcountry").html(options); }) }) }) ';
            $html .= '</script> ';
        };
        return $html;

    Remember, this is part of a PHP function that inserts a script into the html; sorry if it looks a bit messy, I'm new to PHP and Javascript and a bit untidy too. :) Please also remember that this works perfectly in IE on the first visit, but after any navigation I get the error. Thanks guys
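
    One thing worth trying, purely as a debugging sketch, is to bypass IE's aggressive caching of GET requests per call instead of relying on a global $.ajaxSetup that may run before jQuery is available on a cached page. The _ts parameter name below is arbitrary and only illustrates the cache-busting idea:

        $(function () {
            $("select#searchtype").change(function () {
                // Append a unique value so IE cannot serve the JSON from its cache
                $.getJSON("ajaxgetcategories.php",
                          { id: $(this).val(), _ts: new Date().getTime() },
                          function (j) {
                    var options = '<option value="0">--' + j[0].all + '--</option>';
                    for (var i = 0; i < j.length; i++) {
                        options += '<option value="' + j[i].id + '">' + j[i].name + '</option>';
                    }
                    $("select#searchcategory").html(options);
                });
            });
        });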

    Read the article

  • Logging Into a site that uses Live.com authentication with C#

    - by Josh
    I've been trying to automate a log in to a website I frequent, www.bungie.net. The site is associated with Microsoft and Xbox Live, and as such makes use of the Windows Live ID API when people log in to their site. I am relatively new to creating web spiders/robots, and I worry that I'm misunderstanding some of the most basic concepts. I've simulated logins to other sites such as Facebook and Gmail, but live.com has given me nothing but trouble. Anyway, I've been using Wireshark and the Firefox addon Tamper Data to try and figure out what I need to post and what cookies I need to include with my requests. As far as I know, these are the steps one must follow to log in to this site:
    1. Visit https: //login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917
    2. Receive the cookies MSPRequ and MSPOK.
    3. Post the values from the form ID "PPSX", the values from the form ID "PPFT", your username and your password, all to a changing URL similar to: https: //login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct= (there are a few numbers that change at the end of that URL)
    4. Live.com returns the user a page with more hidden forms to post. The client then posts the values from the form "ANON", the value from the form "ANONExp" and the values from the form "t" to the URL: http ://www.bung ie.net/Default.aspx?wa=wsignin1.0
    5. After posting that data, the user is returned a variety of cookies, the most important of which is "BNGAuth", which is the login cookie for the site.
    Where I am having trouble is on the fifth step, but that doesn't necessarily mean I've done all the other steps correctly. I post the data from "ANON", "ANONExp" and "t", but instead of being returned a BNGAuth cookie, I'm returned a cookie named "RSPMaybe" and redirected to the home page. When I reviewed the Wireshark log, I noticed something that instantly stood out to me as different between the log when I logged in with Firefox and when my program ran. It could be nothing, but I'll include the picture here for you to review. I'm being returned an HTTP packet from the site before I post the data in the fourth step. I'm not sure how this is happening, but it must be a side effect of something I'm doing wrong in the HTTPS steps.
![alt text][1] http://img391.imageshack.us/img391/6049/31394881.gif using System; using System.Collections.Generic; using System.Collections.Specialized; using System.Text; using System.Net; using System.IO; using System.IO.Compression; using System.Security.Cryptography; using System.Security.Cryptography.X509Certificates; using System.Web; namespace SpiderFromScratch { class Program { static void Main(string[] args) { CookieContainer cookies = new CookieContainer(); Uri url = new Uri("https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917"); HttpWebRequest http = (HttpWebRequest)HttpWebRequest.Create(url); http.Timeout = 30000; http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "300"); http.Referer = "http://www.bungie.net/"; http.ContentType = "application/x-www-form-urlencoded"; http.CookieContainer = new CookieContainer(); http.Method = WebRequestMethods.Http.Get; HttpWebResponse response = (HttpWebResponse)http.GetResponse(); StreamReader readStream = new StreamReader(response.GetResponseStream()); string HTML = readStream.ReadToEnd(); readStream.Close(); //gets the cookies (they are set in the eighth header) string[] strCookies = response.Headers.GetValues(8); response.Close(); string name, value; Cookie manualCookie; for (int i = 0; i < strCookies.Length; i++) { name = strCookies[i].Substring(0, strCookies[i].IndexOf("=")); value = strCookies[i].Substring(strCookies[i].IndexOf("=") + 1, strCookies[i].IndexOf(";") - strCookies[i].IndexOf("=") - 1); manualCookie = new Cookie(name, "\"" + value + "\""); Uri manualURL = new Uri("http://login.live.com"); http.CookieContainer.Add(manualURL, manualCookie); } //stores the cookies to be used later cookies = http.CookieContainer; //Get the PPSX value string PPSX = HTML.Remove(0, HTML.IndexOf("PPSX")); PPSX = PPSX.Remove(0, PPSX.IndexOf("value") + 7); PPSX = PPSX.Substring(0, PPSX.IndexOf("\"")); //Get this random PPFT value string PPFT = HTML.Remove(0, HTML.IndexOf("PPFT")); PPFT = PPFT.Remove(0, PPFT.IndexOf("value") + 7); PPFT = PPFT.Substring(0, PPFT.IndexOf("\"")); //Get the random URL you POST to string POSTURL = HTML.Remove(0, HTML.IndexOf("https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct=")); POSTURL = POSTURL.Substring(0, POSTURL.IndexOf("\"")); //POST with cookies http = (HttpWebRequest)HttpWebRequest.Create(POSTURL); http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "300"); http.CookieContainer = cookies; http.Referer = "https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268158321&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917"; http.ContentType = "application/x-www-form-urlencoded"; http.Method = WebRequestMethods.Http.Post; Stream ostream = http.GetRequestStream(); //used to convert strings into bytes System.Text.ASCIIEncoding encoding = new 
System.Text.ASCIIEncoding(); //Post information byte[] buffer = encoding.GetBytes("PPSX=" + PPSX +"&PwdPad=IfYouAreReadingThisYouHaveTooMuc&login=YOUREMAILGOESHERE&passwd=YOURWORDGOESHERE" + "&LoginOptions=2&PPFT=" + PPFT); ostream.Write(buffer, 0, buffer.Length); ostream.Close(); HttpWebResponse response2 = (HttpWebResponse)http.GetResponse(); readStream = new StreamReader(response2.GetResponseStream()); HTML = readStream.ReadToEnd(); response2.Close(); ostream.Dispose(); foreach (Cookie cookie in response2.Cookies) { Console.WriteLine(cookie.Name + ": "); Console.WriteLine(cookie.Value); Console.WriteLine(cookie.Expires); Console.WriteLine(); } //SET POSTURL value string POSTANON = "http://www.bungie.net/Default.aspx?wa=wsignin1.0"; //Get the ANON value string ANON = HTML.Remove(0, HTML.IndexOf("ANON")); ANON = ANON.Remove(0, ANON.IndexOf("value") + 7); ANON = ANON.Substring(0, ANON.IndexOf("\"")); ANON = HttpUtility.UrlEncode(ANON); //Get the ANONExp value string ANONExp = HTML.Remove(0, HTML.IndexOf("ANONExp")); ANONExp = ANONExp.Remove(0, ANONExp.IndexOf("value") + 7); ANONExp = ANONExp.Substring(0, ANONExp.IndexOf("\"")); ANONExp = HttpUtility.UrlEncode(ANONExp); //Get the t value string t = HTML.Remove(0, HTML.IndexOf("id=\"t\"")); t = t.Remove(0, t.IndexOf("value") + 7); t = t.Substring(0, t.IndexOf("\"")); t = HttpUtility.UrlEncode(t); //POST the Info and Accept the Bungie Cookies http = (HttpWebRequest)HttpWebRequest.Create(POSTANON); http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Encoding", "gzip,deflate"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "115"); http.CookieContainer = new CookieContainer(); http.ContentType = "application/x-www-form-urlencoded"; http.Method = WebRequestMethods.Http.Post; http.Expect = null; ostream = http.GetRequestStream(); int test = ANON.Length; int test1 = ANONExp.Length; int test2 = t.Length; buffer = encoding.GetBytes("ANON=" + ANON +"&ANONExp=" + ANONExp + "&t=" + t); ostream.Write(buffer, 0, buffer.Length); ostream.Close(); //Here lies the problem, I am not returned the correct cookies. HttpWebResponse response3 = (HttpWebResponse)http.GetResponse(); GZipStream gzip = new GZipStream(response3.GetResponseStream(), CompressionMode.Decompress); readStream = new StreamReader(gzip); HTML = readStream.ReadToEnd(); //gets both cookies string[] strCookies2 = response3.Headers.GetValues(11); response3.Close(); } } } This has given me problems and I've put many hours into learning about HTTP protocols so any help would be appreciated. If there is an article detailing a similar log in to live.com feel free to point the way. I've been looking far and wide for any articles with working solutions. If I could be clearer, feel free to ask as this is my first time using Stack Overflow. Cheers, --Josh

    Read the article

  • Logging Into a site that uses Live.com authentication

    - by Josh
    I've been trying to automate a log in to a website I frequent, www.bungie.net. The site is associated with Microsoft and Xbox Live, and as such makes use of the Windows Live ID API when people log in to their site. I am relatively new to creating web spiders/robots, and I worry that I'm misunderstanding some of the most basic concepts. I've simulated logins to other sites such as Facebook and Gmail, but live.com has given me nothing but trouble. Anyway, I've been using Wireshark and the Firefox addon Tamper Data to try and figure out what I need to post and what cookies I need to include with my requests. As far as I know, these are the steps one must follow to log in to this site:
    1. Visit https: //login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917
    2. Receive the cookies MSPRequ and MSPOK.
    3. Post the values from the form ID "PPSX", the values from the form ID "PPFT", your username and your password, all to a changing URL similar to: https: //login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct= (there are a few numbers that change at the end of that URL)
    4. Live.com returns the user a page with more hidden forms to post. The client then posts the values from the form "ANON", the value from the form "ANONExp" and the values from the form "t" to the URL: http ://www.bung ie.net/Default.aspx?wa=wsignin1.0
    5. After posting that data, the user is returned a variety of cookies, the most important of which is "BNGAuth", which is the login cookie for the site.
    Where I am having trouble is on the fifth step, but that doesn't necessarily mean I've done all the other steps correctly. I post the data from "ANON", "ANONExp" and "t", but instead of being returned a BNGAuth cookie, I'm returned a cookie named "RSPMaybe" and redirected to the home page. When I reviewed the Wireshark log, I noticed something that instantly stood out to me as different between the log when I logged in with Firefox and when my program ran. It could be nothing, but I'll include the picture here for you to review. I'm being returned an HTTP packet from the site before I post the data in the fourth step. I'm not sure how this is happening, but it must be a side effect of something I'm doing wrong in the HTTPS steps.
using System; using System.Collections.Generic; using System.Collections.Specialized; using System.Text; using System.Net; using System.IO; using System.IO.Compression; using System.Security.Cryptography; using System.Security.Cryptography.X509Certificates; using System.Web; namespace SpiderFromScratch { class Program { static void Main(string[] args) { CookieContainer cookies = new CookieContainer(); Uri url = new Uri("https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917"); HttpWebRequest http = (HttpWebRequest)HttpWebRequest.Create(url); http.Timeout = 30000; http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "300"); http.Referer = "http://www.bungie.net/"; http.ContentType = "application/x-www-form-urlencoded"; http.CookieContainer = new CookieContainer(); http.Method = WebRequestMethods.Http.Get; HttpWebResponse response = (HttpWebResponse)http.GetResponse(); StreamReader readStream = new StreamReader(response.GetResponseStream()); string HTML = readStream.ReadToEnd(); readStream.Close(); //gets the cookies (they are set in the eighth header) string[] strCookies = response.Headers.GetValues(8); response.Close(); string name, value; Cookie manualCookie; for (int i = 0; i < strCookies.Length; i++) { name = strCookies[i].Substring(0, strCookies[i].IndexOf("=")); value = strCookies[i].Substring(strCookies[i].IndexOf("=") + 1, strCookies[i].IndexOf(";") - strCookies[i].IndexOf("=") - 1); manualCookie = new Cookie(name, "\"" + value + "\""); Uri manualURL = new Uri("http://login.live.com"); http.CookieContainer.Add(manualURL, manualCookie); } //stores the cookies to be used later cookies = http.CookieContainer; //Get the PPSX value string PPSX = HTML.Remove(0, HTML.IndexOf("PPSX")); PPSX = PPSX.Remove(0, PPSX.IndexOf("value") + 7); PPSX = PPSX.Substring(0, PPSX.IndexOf("\"")); //Get this random PPFT value string PPFT = HTML.Remove(0, HTML.IndexOf("PPFT")); PPFT = PPFT.Remove(0, PPFT.IndexOf("value") + 7); PPFT = PPFT.Substring(0, PPFT.IndexOf("\"")); //Get the random URL you POST to string POSTURL = HTML.Remove(0, HTML.IndexOf("https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct=")); POSTURL = POSTURL.Substring(0, POSTURL.IndexOf("\"")); //POST with cookies http = (HttpWebRequest)HttpWebRequest.Create(POSTURL); http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "300"); http.CookieContainer = cookies; http.Referer = "https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268158321&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917"; http.ContentType = "application/x-www-form-urlencoded"; http.Method = WebRequestMethods.Http.Post; Stream ostream = http.GetRequestStream(); //used to convert strings into bytes System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding(); //Post information byte[] buffer = 
encoding.GetBytes("PPSX=" + PPSX +"&PwdPad=IfYouAreReadingThisYouHaveTooMuc&login=YOUREMAILGOESHERE&passwd=YOURWORDGOESHERE" + "&LoginOptions=2&PPFT=" + PPFT); ostream.Write(buffer, 0, buffer.Length); ostream.Close(); HttpWebResponse response2 = (HttpWebResponse)http.GetResponse(); readStream = new StreamReader(response2.GetResponseStream()); HTML = readStream.ReadToEnd(); response2.Close(); ostream.Dispose(); foreach (Cookie cookie in response2.Cookies) { Console.WriteLine(cookie.Name + ": "); Console.WriteLine(cookie.Value); Console.WriteLine(cookie.Expires); Console.WriteLine(); } //SET POSTURL value string POSTANON = "http://www.bungie.net/Default.aspx?wa=wsignin1.0"; //Get the ANON value string ANON = HTML.Remove(0, HTML.IndexOf("ANON")); ANON = ANON.Remove(0, ANON.IndexOf("value") + 7); ANON = ANON.Substring(0, ANON.IndexOf("\"")); ANON = HttpUtility.UrlEncode(ANON); //Get the ANONExp value string ANONExp = HTML.Remove(0, HTML.IndexOf("ANONExp")); ANONExp = ANONExp.Remove(0, ANONExp.IndexOf("value") + 7); ANONExp = ANONExp.Substring(0, ANONExp.IndexOf("\"")); ANONExp = HttpUtility.UrlEncode(ANONExp); //Get the t value string t = HTML.Remove(0, HTML.IndexOf("id=\"t\"")); t = t.Remove(0, t.IndexOf("value") + 7); t = t.Substring(0, t.IndexOf("\"")); t = HttpUtility.UrlEncode(t); //POST the Info and Accept the Bungie Cookies http = (HttpWebRequest)HttpWebRequest.Create(POSTANON); http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Encoding", "gzip,deflate"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "115"); http.CookieContainer = new CookieContainer(); http.ContentType = "application/x-www-form-urlencoded"; http.Method = WebRequestMethods.Http.Post; http.Expect = null; ostream = http.GetRequestStream(); int test = ANON.Length; int test1 = ANONExp.Length; int test2 = t.Length; buffer = encoding.GetBytes("ANON=" + ANON +"&ANONExp=" + ANONExp + "&t=" + t); ostream.Write(buffer, 0, buffer.Length); ostream.Close(); //Here lies the problem, I am not returned the correct cookies. HttpWebResponse response3 = (HttpWebResponse)http.GetResponse(); GZipStream gzip = new GZipStream(response3.GetResponseStream(), CompressionMode.Decompress); readStream = new StreamReader(gzip); HTML = readStream.ReadToEnd(); //gets both cookies string[] strCookies2 = response3.Headers.GetValues(11); response3.Close(); } } } This has given me problems and I've put many hours into learning about HTTP protocols so any help would be appreciated. If there is an article detailing a similar log in to live.com feel free to point the way. I've been looking far and wide for any articles with working solutions. If I could be clearer, feel free to ask as this is my first time using Stack Overflow.

    Read the article

  • How to have member variables and public methods in a jQuery plugin?

    - by user169867
    I'm trying to create a jQuery plugin that will create something like an autoCompleteBox, but with custom features. How do I store member variables for each matching jQuery element? For example, I'll need to store a timerID for each. I'd also like to store references to some of the DOM elements that make up the control. I'd like to be able to make a public method that works something like:

        $("#myCtrl").autoCompleteEx.addItem("1");

    But in the implementation of addItem(), how can I access the member variables for that particular object, like its timerID or whatever? Below is what I have so far. Thanks for any help or suggestions!

        (function($) {
            // Attach this new method to jQuery
            $.fn.autoCompleteEx = function(options) {
                // Merge given options w/ defaults, but don't alter either
                var opts = $.extend({}, $.fn.autoCompleteEx.defaults, options);

                // Iterate over the current set of matched elements
                return this.each(function() {
                    var acx = $(this); // Get jQuery version of element (should be div)

                    // Give div correct class & add <ul> w/ input item to it
                    acx.addClass("autoCompleteEx");
                    acx.html("<ul><li class=\"input\"><input type=\"text\"/></li></ul>");

                    // Grab input as jQuery object
                    var input = $("input", acx);

                    // Wire up div
                    acx.click(function() { input.focus().val( input.val() ); });

                    // Wire up input
                    input.keydown(function(e) {
                        var kc = e.keyCode;
                        if (kc == 13) {        // Enter
                        } else if (kc == 27) { // Esc
                        } else {
                            // Resize text area to input
                            var width = 50 + (_txtArea.val().length * 10);
                            _txtArea.css("width", width + "px");
                        }
                    });
                }); // End each jQuery element
            }; // End autoCompleteEx()

            // Private functions
            function junk() { };

            // Public functions
            $.fn.autoCompleteEx.addItem = function(id, txt) {
                var x = this;
                var y = 0;
            };

            // Default settings
            $.fn.autoCompleteEx.defaults = {
                minChars: 2,
                delay: 300,
                maxItems: 1
            };
        })(jQuery); // End of closure
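
    One common pattern for this kind of per-element state, sketched below on the assumption that .data() storage is enough, is to keep the "member variables" in jQuery's data store keyed by the plugin name, and have public methods look that state up again. The method and key names here are illustrative only and do not match any existing plugin API:

        (function ($) {
            $.fn.autoCompleteEx = function (options) {
                var opts = $.extend({}, $.fn.autoCompleteEx.defaults, options);
                return this.each(function () {
                    // Per-element state lives in .data(), keyed by the plugin name
                    $(this).data("autoCompleteEx", {
                        timerID: null,
                        list: $("<ul/>").appendTo(this)
                    });
                });
            };

            // Public method: called as $("#myCtrl").autoCompleteExAddItem("1")
            $.fn.autoCompleteExAddItem = function (id, txt) {
                return this.each(function () {
                    var state = $(this).data("autoCompleteEx"); // this element's own state
                    if (state) {
                        state.list.append($("<li/>").text(txt || id));
                    }
                });
            };

            $.fn.autoCompleteEx.defaults = { minChars: 2, delay: 300, maxItems: 1 };
        })(jQuery);

    The method is defined as a separate $.fn function rather than a property of autoCompleteEx because a plain property has no way of knowing which element the call refers to; re-selecting the element and reading its .data() entry solves that.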

    Read the article

  • Popup browser incompatibility

    - by Cornelis
    I have a popup with drop-down menus on it. I've scaled it down and simplified it for test purposes, but it still doesn't work the way I want/it should.

        <!DOCTYPE html>
        <html>
        <head>
            <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>
            <script type="text/javascript">
                jQuery(document).ready(function(){
                    jQuery('.trigger').click(function(){
                        var picker = jQuery('.popup');
                        jQuery('<div></div>').css({
                            height: screen.height,
                            width: screen.width,
                            position: 'absolute',
                            'z-index': -1,
                            top: picker.offset().top*-1,
                            left: picker.offset().left*-1,
                            border: '1px solid red'
                        }).click(function(){
                            picker.trigger('focusout');
                            jQuery(this).hide();
                        }).prependTo(picker);
                        picker.css('visibility', 'visible');
                    });
                    jQuery('.popup').live("focusout", function() {
                        jQuery('.popup').fadeTo(500, 0.0, function() {
                            jQuery('.popup').css('visibility', 'hidden');
                            jQuery('.popup').css('opacity', '1.0');
                        });
                    });
                });
            </script>
        </head>
        <body>
            <p>
                <input type=text class=trigger />
                <div id=popup-div class=popup style="visibility: hidden; border: 1px solid red">
                    <select>
                        <option>option1</option>
                    </select>
                    <p>Popup text</p>
                </div>
            </p>
        </body>

    When you click on the input field, a 'popup' appears; if you click outside the red border it fades away. If you click on the select option it shouldn't disappear! However, on this point Chrome doesn't behave the same as IE/FF/Opera/Safari, and makes the div disappear. (Using Chrome 4.0.295.0.) Does anybody know a work-around for Chrome? Calling event.stopPropagation() on select elements has not worked so far.
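
    A minimal sketch of one alternative approach that avoids the invisible overlay and the focusout event entirely: close the popup from a document-level click handler that checks whether the click landed inside the popup. This tends to behave more consistently across browsers, though it is only a sketch against the markup above:

        jQuery(document).ready(function () {
            var picker = jQuery('.popup');

            jQuery('.trigger').click(function (e) {
                e.stopPropagation();               // don't let this click reach the document handler
                picker.css('visibility', 'visible');
            });

            // Close only when the click happened outside the popup itself
            jQuery(document).click(function (e) {
                if (jQuery(e.target).closest('.popup').length === 0) {
                    picker.css('visibility', 'hidden');
                }
            });
        });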

    Read the article

  • jQuery Cycle: how can I change from images to divs?

    - by vick
        <!doctype html>
        <html>
        <head>
            <title>JQuery Cycle Plugin - Example Slideshow</title>
            <style type="text/css">
                .slideshow { height: 232px; width: 232px; margin: auto }
                .slideshow img { padding: 15px; border: 1px solid #ccc; background-color: #eee; }
            </style>
            <!-- include jQuery library -->
            <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.1/jquery.min.js"></script>
            <!-- include Cycle plugin -->
            <script type="text/javascript" src="http://cloud.github.com/downloads/malsup/cycle/jquery.cycle.all.2.74.js"></script>
            <!-- initialize the slideshow when the DOM is ready -->
            <script type="text/javascript">
                $(document).ready(function() {
                    $('.slideshow').cycle({
                        fx: 'shuffle' // choose your transition type, ex: fade, scrollUp, shuffle, etc...
                    });
                });
            </script>
        </head>
        <body>
            <div class="slideshow">
                <img src="http://cloud.github.com/downloads/malsup/cycle/beach1.jpg" width="200" height="200" />
                <img src="http://cloud.github.com/downloads/malsup/cycle/beach2.jpg" width="200" height="200" />
                <img src="http://cloud.github.com/downloads/malsup/cycle/beach3.jpg" width="200" height="200" />
            </div>
        </body>
        </html>

    How can I make this exact scroller work with divs instead of img tags? Basically, I want to use

        <div> etc etc etc </div>

    instead of:

        <img src="http://cloud.github.com/downloads/malsup/cycle/beach3.jpg" width="200" height="200" />
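
    The Cycle plugin animates whatever child elements the slideshow container holds, not just images, so a sketch like the one below should work with the same initialization as long as each div has explicit dimensions. The .slide class name and the sample content are made up for illustration:

        <style type="text/css">
            .slideshow { height: 232px; width: 232px; margin: auto }
            .slideshow div.slide { width: 200px; height: 200px; padding: 15px;
                                   border: 1px solid #ccc; background-color: #eee; }
        </style>

        <div class="slideshow">
            <div class="slide"><h3>First slide</h3><p>Any markup can go here.</p></div>
            <div class="slide"><h3>Second slide</h3><p>More content.</p></div>
            <div class="slide"><h3>Third slide</h3><p>And so on.</p></div>
        </div>

        <script type="text/javascript">
            // Cycle rotates the container's child elements, whatever they are,
            // provided they have explicit dimensions.
            $(document).ready(function() {
                $('.slideshow').cycle({ fx: 'fade' });
            });
        </script>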

    Read the article

  • Live search results as you type... am I going about this the right way? jQuery + PHP

    - by dallen
    This is my first time building a tool like this, so please bear with me. I'm doing this to learn more about jQuery and AJAX. Basically, I have a search input and a hidden div. When you start typing in the search input, the hidden div becomes visible and results are brought in. In this case, I'm searching for client names. It all works fine, but I think my code could be better and I'm not sure exactly where to begin. Each keyup requests a PHP script which accesses a table in a database to find a like string. But in my PHP script, I'm echo'ing some JS/jQuery, which I'm not sure is good practice. Below is my code. Am I going about this the right way, or am I totally off base? Any suggestions for improvement?

    Javascript:

        $("#search").keyup(function() {
            $("#search_results").show("fast");
            $.ajax({
                type: "POST",
                url: "http://localhost:8888/index.php/welcome/search/" + $("#search").val(),
                success: function(html) {
                    $("#search_results").html(html);
                }
            });
        });

    PHP:

        function search($search_string = false)
        {
            if ($search_string) {
                $this->db->like('name', $search_string);
                $query = $this->db->get('clients');
                if ($query->num_rows() == 0) {
                    echo "No client exists.";
                } else {
                    foreach ($query->result() as $row) {
                        echo '<script>';
                        echo '
                        $("#client_results_'.$row->id.'").hide();
                        $("#'.$row->id.'").toggle(function() {
                            $.ajax({
                                type: "POST",
                                url: "http://localhost:8888/index.php/welcome/search_client_ads/" + '.$row->id.',
                                success: function(html) {
                                    $("#client_results_'.$row->id.'").html(html).show("fast");
                                }
                            });
                        }, function() {
                            $("#client_results_'.$row->id.'").hide("fast").html("");
                        });';
                        echo '</script>';
                        echo '<p><span id="'.$row->id.'">'.$row->name.'</span></p>';
                        echo '<div id="client_results_'.$row->id.'"></div>';
                    }
                }
            } else {
                echo '';
            }
        }

        function search_client_ads($client_id)
        {
            $query = $this->db->get_where('online_ads', array('client' => $client_id));
            if ($query->num_rows() == 0) {
                echo "No ads exist.";
            } else {
                foreach ($query->result() as $row) {
                    echo $row->id;
                }
            }
        }
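
    A minimal sketch of how the client side could be reorganized so the PHP only returns markup (no inline <script> blocks): debounce the keyup so a request fires only after typing pauses, and use one delegated handler for all result rows. The "client" class on the result spans is an assumption here; it would have to be added to the PHP output:

        var searchTimer = null;

        $("#search").keyup(function () {
            var term = $(this).val();
            clearTimeout(searchTimer);                 // debounce: wait until typing pauses
            searchTimer = setTimeout(function () {
                $.post("/index.php/welcome/search/" + encodeURIComponent(term), function (html) {
                    $("#search_results").html(html).show("fast");
                });
            }, 250);
        });

        // One delegated handler instead of a per-row inline <script> block
        $("#search_results span.client").live("click", function () {
            var id = this.id;
            $.post("/index.php/welcome/search_client_ads/" + id, function (html) {
                $("#client_results_" + id).html(html).toggle("fast");
            });
        });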

    Read the article

  • Form value field is visible on localhost but not on live site - how to hide?

    - by Joel
    Hi guys, I've almost completed moving my first live site to my new XAMPP setup on localhost. I have a form that uses jQuery in the header of the site. It's a bit verbose, but here it is:

        <div class="outeremailcontainer">
            <div id="emailcontainer">
                <?php include('verify.php'); ?>
                <form action="index_success.php" method="post" id="sendEmail" class="email">
                    <h3 class="register2">Newsletter Signup:</h3>
                    <ul class="forms email">
                        <li class="name"><label for="yourName">Name: </label>
                            <input type="text" name="yourName" class="info" id="yourName" value="<?= $_POST['yourName']; ?>" /><br />
                        </li>
                        <li class="city"><label for="yourCity">City: </label>
                            <input type="text" name="yourCity" class="info" id="yourCity" value="<?= $_POST['yourCity']; ?>" /><br />
                        </li>
                        <li class="email"><label for="emailFrom">Email: </label>
                            <input type="text" name="emailFrom" class="info" id="emailFrom" value="<?= $_POST['emailFrom']; ?>" />
                            <?php if(isset($emailFromError)) echo '<span class="error">'.$emailFromError.'</span>'; ?>
                        </li>
                        <li class="buttons email">
                            <button type="submit" id="submit">Send</button>
                            <input type="hidden" name="submitted" id="submitted" value="true" />
                        </li>
                    </ul>
                </form>
                <div class="clearing"></div>
            </div>
        </div>

    So I am using jQuery (I can include the function if need be) and it hides fields, etc. The problem is that on the localhost site, the raw values are showing in the fields, i.e. the first field literally has this in the box:

        <?= $_POST['yourName']; ?>

    It works great on the live site, though. Any idea how to fix this? Thanks!

    Read the article

  • ASP.NET AJAX, jQuery and AJAX Control Toolkit&ndash;the roadmap

    - by Harish Ranganathan
    The opinions mentioned herein are solely mine and do not reflect those of my employer Wanted to post this for a long time but couldn’t.  I have been an ASP.NET Developer for quite sometime and have worked with version 1.1, 2.0, 3.5 as well as the latest 4.0. With ASP.NET 2.0 and Visual Studio 2005, came the era of AJAX and rich UI style web applications.  So, ASP.NET AJAX (codenamed “ATLAS”) was released almost an year later.  This was called as ASP.NET 2.0 AJAX Extensions.  This release was supported further with Visual Studio 2005 Service Pack 1. The initial release of ASP.NET AJAX had 3 components ASP.NET AJAX Library – Client library that is used internally by the server controls as well as scripts that can be used to write hand coded ajax style pages ASP.NET AJAX Extensions – Server controls i.e. ScriptManager,Proxy, UpdatePanel, UpdateProgress and Timer server controls.  Works pretty much like other server controls in terms of development and render client side behavior automatically AJAX Control Toolkit – Set of server controls that extend a behavior or a capability.  Ex.- AutoCompleteExtender The AJAX Control Toolkit was a separate download from CodePlex while the first two get installed when you install ASP.NET AJAX Extensions. With Visual Studio 2008, ASP.NET AJAX made its way into the runtime.  So one doesn’t need to separately install the AJAX Extensions.  However, the AJAX Control Toolkit still remained as a community project that can be downloaded from CodePlex.  By then, the toolkit had close to 30 controls. So, the approach was clear viz., client side programming using ASP.NET AJAX Library and server side model using built-in controls (UpdatePanel) and/or AJAX Control Toolkit. However, with Visual Studio 2008 Service Pack 1, we also added support for the ever increasing popular jQuery library.  That is, you can use jQuery along with ASP.NET and would also get intellisense for jQuery in Visual Studio 2008. Some of you who have played with Visual Studio 2010 Beta and .NET Framework 4 Beta, would also have explored the new AJAX Library which had a lot of templates, live bindings etc.,  But, overall, the road map ahead makes it much simplified. For client side programming using JavaScript for implementing AJAX in ASP.NET, the recommendation is to use jQuery which will be shipped along with Visual Studio and provides intellisense as well. For server side programming one you can use the server controls like UpdatePanel etc., and also the AJAX Control Toolkit which has close to 40 controls now.  The AJAX Control Toolkit still remains as a separate download at CodePlex.  You can download the different versions for different versions of ASP.NET at http://ajaxcontroltoolkit.codeplex.com/ The Microsoft AJAX Library will still be available through the CDN (Content Delivery Network) channels.  You can view the CDN resources at http://www.asp.net/ajaxlibrary/CDN.ashx Similarly even jQuery and the toolkit would be available as CDN resources in case you chose not to download and have them as a part of your application. I think this makes AJAX development pretty simple.  Earlier, having Microsoft AJAX Library as well as jQuery for client side scripting was kind of confusing on which one to use.  With this roadmap, it makes it simple and clear. You can read more on this at http://ajax.asp.net I hope this post provided some clarity on the AJAX roadmap as I could decipher from various product teams. Cheers!!!

    Read the article

  • ASP.NET Connections Fall 2011 Slides and Code

    - by Stephen Walther
    Thanks everyone who came to my talks at ASP.NET Connections in Las Vegas!  There was a definite theme to my talks this year…taking advantage of JavaScript to build a rich presentation layer. I gave the following three talks: JsRender Templates – Originally, I was scheduled to give a talk on jQuery Templates.  However, jQuery Templates has been deprecated and JsRender is the new technology which replaces jQuery Templates. In the talk, I give plenty of code samples of using JsRender.  You can download the slides and code samples RIGHT HERE   HTML5 – In this talk, I focused on the features of HTML5 which are the most interesting to developers building database-driven Web applications. In particular, I discussed Web Sockets,  Web workers, Web Storage, Indexed DB, and the Offline Application Cache. All of these features are supported by Safari, Chrome, and Firefox today and they will be supported by Internet Explorer 10. You can download the slides and code samples RIGHT HERE   Ajax Control Toolkit – My company, Superexpert, is responsible for developing and maintaining the Ajax Control Toolkit. In this talk, I discuss all of the bug fixes and new features which the developers on the Superexpert team have added to the Ajax Control Toolkit over the previous six months. We also had a good discussion of the features which people want in future releases of the Ajax Control Toolkit. The slides and code samples for this talk can be downloaded RIGHT HERE   I had a great time in Las Vegas!  Good questions, friendly audience, and lots of opportunities for me to learn new things!      -- Stephen

    Read the article

  • Links set to open in a new window are sometimes treated as pop-up windows when clicked [migrated]

    - by Test Developer
    For the past few days, we have been facing an issue with Chrome's behavior related to opening links set to open in a new tab/window. The details are as follows: I have a collection of links, and each link points to a different resource to be opened in a new tab/window. The code is as follows:

        <a class="cssClass" rid="1114931" href="http://www.domain.com/resources/abc.html" title="Link1" tabindex="4">Link 1</a>

    There are a few checks/filters over accessing the resources, which have been implemented as an onClick handler on the links. If any of the validations fails, the onClick handler returns false and the default behavior of the link does not happen, i.e. the link does not get opened. One of these (the last) checks includes an AJAX call in sync mode. The code is as follows:

        var link_clickHandler = function(evt /* Event */) {
            var objTarget = jQuery(evt.target);
            if (check1) {
                return false;
            } else if (check2) {
                return false;
            } else if (check3) {
                var blnRetVal = false;
                jQuery.ajax({
                    "async"       : false,
                    "type"        : "GET",
                    "contentType" : "application/json; charset=utf-8",
                    "url"         : "index.php",
                    "data"        : 'resourceid=' + intResourceId,
                    "dataType"    : "json",
                    "forceData"   : true,
                    "success"     : function(data) {
                        if (check1) {
                            blnRetVal = true;
                        }
                    },
                    "error"       : function(error) {
                    }
                });
                return blnRetVal;
            }
        };

        jQuery("a.cssClass").live("click", link_clickHandler);

    ISSUE: The issue is that Chrome is behaving very weirdly. When all of the checks pass and the onClick handler returns true, sometimes the resource gets opened in a new tab/window and sometimes it gets opened as a pop-up (which should never happen). I have tried to find a pattern but could not. Any solution, or even help in understanding the behavior, would be really appreciated.
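
    One workaround to experiment with, sketched only against the handler above, is to stop relying on the link's default behavior altogether: keep the check synchronous, call window.open explicitly from inside the user-initiated click handler, and always return false. Whether Chrome classifies the resulting window as a pop-up can still depend on its own heuristics, so treat this as an experiment rather than a fix:

        var link_clickHandler = function (evt) {
            var objTarget = jQuery(evt.target);
            var passed = false;

            jQuery.ajax({
                async: false,              // synchronous, so "passed" is set before we continue
                type: "GET",
                url: "index.php",
                data: "resourceid=" + objTarget.attr("rid"),
                dataType: "json",
                success: function (data) { passed = true; }
            });

            if (passed) {
                // Open explicitly, still inside the user-initiated handler
                window.open(objTarget.attr("href"), "_blank");
            }
            return false;                  // never fall back to the default navigation
        };

        jQuery("a.cssClass").live("click", link_clickHandler);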

    Read the article

  • Loading main javascript on every page? Or breaking it up to relevant pages?

    - by Kyle
    I have a 700kb (decompressed) JS file which is loaded on every page. Before, I had 12 javascript files on each page, but to reduce HTTP requests I combined them all into one file. This file is ~130kb gzipped and is served with gzip; however, on the client it is still unpacked and loaded on every page. Is this a performance issue? I've profiled the javascript with the Firebug profiler but did not see any issues. The problem (or illusion) I am facing is that there are jQuery libraries in that file that are sometimes not used on the current page. For example, jQuery DataTables is 200kb compressed and is only loaded on 2 of my website's pages. Another is jqPlot, which is another 200kb. I now have 400kb of excess code that isn't executed on 80% of the pages. Should I leave everything in one file? Should I take out the jQuery libraries and load only the relevant JS on the current page?
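
    If splitting the bundle turns out to be worthwhile, one lightweight sketch is to keep the small shared code in the common file and pull in the heavy libraries only on pages that actually use them. The selectors and file paths below are placeholders:

        // Load the heavy libraries only on pages that actually need them
        $(function () {
            if ($("table.datatable").length) {
                $.getScript("/js/jquery.dataTables.min.js", function () {
                    $("table.datatable").dataTable();
                });
            }
            if ($("#chart").length) {
                $.getScript("/js/jquery.jqplot.min.js" /* , then draw the plot in the callback */);
            }
        });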

    Read the article

  • Remove Bottom panel from LiveCD Install

    - by Uri Herrera
    How can I remove, or autohide to 0, the bottom Gnome panel from a live CD, and autostart AWN to replace it? So far I have found some commands to autohide the bottom panel:

        gconftool-2 --type bool --set /apps/panel/default_setup/toplevels/bottom_panel/auto_hide 1
        gconftool-2 --set /apps/panel/default_setup/toplevels/bottom_panel/auto_hide --type bool 1
        gconftool-2 -t bool -s /apps/panel/default_setup/toplevels/bottom_panel/auto_hide true

    but I don't know which one will do what I want.

    Read the article

  • Install Ubuntu on USB + Disk Encryption

    - by snipey
    I want to create an operating system installed on a USB stick (4 GB) instead of the HDD, so I wanted to know if there are any special steps, or if I can simply choose the USB drive in the installation menu. P.S. I want to do a full install, not a live boot. After that I want to encrypt the entire operating system using TrueCrypt (a guide is already present on their website); I just wanted to know if that would be compatible with this method. Thanks :)

    Read the article

  • Should I install ubuntu on USB instead of HDD dual-boot?

    - by user2147243
    I had Ubuntu 12.04 installed as dual-boot OS on top of Vista on my laptop. Hacked the grub settings to default to Vista (instead of the default Ubuntu -- pain) on startup, and all was OK for occasional Ubuntu use for past 6 months. Then last week I got a strange message about 'lack of disk space' (~50MB free) when installing pxyplot, even though there was still about 6GB free disk space when I checked later. Then today the Ubuntu wouldn't load at all, and checking the HDD partitions in Vista it looked like the 15GB Ubuntu partition was now three smaller partitions! So, I got rid of those partitions and expanded the Vista partition to use the reclaimed space. Now can't restart ('grub rescue' appears and doesn't 'rescue' anything), so I'll have to do a boot recovery using a Vista installation CD. (Not a particularly user-friendly failure mode of the dual-boot installation!) I now have to decide to either a) try installing ubuntu on the HDD again, but don't want to stuff up my Vista ever again, as that is my most used OS, or b) install Ubuntu on a 16GB USB 3.0 stick. Apparently performance from USB won't be as good as from HDD, and running OS from USB stick does lots of r/w so the stick may fail after a few years! Perhaps installing Ubuntu on live USB and setup to then run in RAM would alleviate the performance/USB lifespan problems? If I create a live-USB for Ubuntu OS, will it boot off that when I restart the laptop with it plugged in? Or will I have to change the laptop setting for boot-order whenever I want to boot Ubuntu instead of Vista (that would be even more painful than the grub default boot order putting Ubuntu ahead of the existing Vista OS!) -- update: I recovered my Vista setup using Iolo SystemMechanic Disaster Recovery Tool, and created a bootable USB of Ubuntu 13.10 on an 8GB USB3.0 pendrive, with 4GB of 'persistence' to allow saving of settings, install some packages etc. It worked OK for a couple of test boots, but once I changed the time and desktop wallpaper, the next Ubuntu reboot crashed and I then couldn't get it to boot successfully. So I decided to install Ubuntu 12.04 LTS as a dual-boot again, but this time instead of partitioning the HDD and installing from an ISO DVD I used the wubi.exe tool to install Ubuntu as a dual-boot. Worked very well, although one oddity was that, despite asking how big the make the partition (20GB), the installed Ubuntu appears to be happily installed somewhere within the Vista NTFS file system (no partition shows up in Windows disk manager, and in Ubuntu disk management tool the entire 133 GB of HDD is showing, with ~40GB free space). A nice feature of installing the dual-boot using wubi is that the laptop now uses Windows boot manager on startup, with Vista as the default OS and Ubuntu happily listed as second on the list. So far so good.

    Read the article

  • Can't install alternate CD from USB?

    - by mattias
    Hi im trying to install ubuntu 12.04 with full hard disk encryption. After downloading and installing the Ubuntu live CD, I learned that truecrypt doesnt support full disk encryption on linux. I also learned that the best way to get "nearly full disk encryption" on ubuntu is by installing it from the alternate install CD. I tried that, but something is wrong with my CD reader/burner so it doesnt boot up when i insert the cd. My thought here was to take the .iso that I downloaded on my unencrypted Ubuntu system, use Unetbootin to make the usb drive. The usb drive used for this is exactly the same brand as one that I know has worked with a previous ubuntu live system on the same computer. I also used unetbootin for that usb, but I created it from windows that time. The usb stick boots up fine and i get through the first couple of steps in the installation process. However, After a while I get a "box" with the following error message "Load Installer components from CD" There was a problem reading data from the CD-ROM. Please make sure it is in the drive. If retrying does not work, you should check the integrity of your CD-ROM. "Failed to copy file from CD-ROM. Retry? " Then I cant get any further. I googled a lot and found this page which seems to tackle this very problem: http://www.dotkam.com/2010/11/29/ins...mage-from-usb/ I tried to do what it said. After pressing TAB, I wrote : cdrom-detect/try-usb=true without quotes because that's what i think is right. When I press TAB, there already is a text saying : /ubnkern initrd=/ubninit vga=788 -- quiet which can be removed. I have tried to both delete the text before the "--" and just inserting cdrom-detect/try-usb=true before it. Any idea of what can be wrong? I would like to do a full system encryption, or as full as it is possible. I dont want to just encrypt my /home folder. Maybe this isn't the easiest way. I use SanDisk usb sticks. I know there is a problem with U3 launcher on some SanDisks, but I never had to remove U3 before from similar disks, and the alternate install does boot up, so I dont think using U3 removal would help me. Any help or indication to an easier way to do this would be appreciated

    Read the article

  • How to get the row count for a jQuery grid (jqGrid)?

    - by kumar
    I used this code to get the count of records in the jqGrid:

        var numRows = jQuery("#mygrid").jqGrid('getGridParam', 'records');

    But wherever I place it in my view, before or after the grid, I always get a result of 0, because it always runs before the grid has loaded. I need to place this code somewhere it will run after the grid has loaded. If I put something like this:

        alert("hello");
        var numRows = jQuery("#mygrid").jqGrid('getGridParam', 'records');
        alert(numRows);

    i.e. if I show an alert first and then count, I get the number of records. But if I use the code directly:

        var numRows = jQuery("#mygrid").jqGrid('getGridParam', 'records');
        alert(numRows);

    I get 0 as the output. I don't know why it behaves like this; if we show the first alert box anywhere, then the second alert box shows the row count. Can anybody help me out? Thanks
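
    A minimal sketch of reading the count from jqGrid's loadComplete callback, so it only runs after the grid's data request has finished; the url and datatype options here are placeholders for whatever the grid is already configured with:

        jQuery("#mygrid").jqGrid({
            url: "data.php",            // placeholder grid options
            datatype: "json",
            // loadComplete fires after the data has been loaded,
            // so the 'records' parameter is populated by the time it runs
            loadComplete: function () {
                var numRows = jQuery("#mygrid").jqGrid('getGridParam', 'records');
                alert(numRows);
            }
        });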

    Read the article

  • jquery click on anchor element forces scroll to top?

    - by Dan.StackOverflow
    This SO question (http://stackoverflow.com/questions/720970/jquery-hyperlinks-href-value) seems to be a duplicate, but the accepted answer doesn't solve the problem; sorry if this is bad SO etiquette. I am running into a problem using jQuery and a click event attached to an anchor element. In my .ready() function I have:

        jQuery("#id_of_anchor").click(function(event) {
            // start function when any update link is clicked
            Function_that_does_ajax();
        });

    and my anchor looks like this:

        <a href="#" id="id_of_anchor"> link text </a>

    But when the link is clicked, the ajax function runs as desired, yet the browser scrolls to the top of the page, which is not good. I've tried adding

        event.preventDefault();

    before calling my function that does the ajax, but that doesn't help. What am I missing?

    Clarification: I've used every combination of

        return false;
        event.preventDefault();
        event.stopPropagation();

    before and after my call to my js ajax function. It still scrolls to the top.
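
    Two things worth checking, sketched below: that the handler is actually bound when the click happens (delegation with .live covers anchors that are injected or replaced after .ready runs), and that the default "#" navigation is cancelled inside that same handler. Function_that_does_ajax is the function from the question:

        // Delegated binding also catches anchors added after page load,
        // and the default "#" navigation is cancelled explicitly.
        jQuery("#id_of_anchor").live("click", function (event) {
            event.preventDefault();   // no jump to the top of the page
            Function_that_does_ajax();
            return false;             // also stops propagation, belt and braces
        });

    As a further test, replacing href="#" with href="javascript:void(0)" removes the fragment navigation entirely, which helps tell whether another handler is causing the scroll.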

    Read the article

  • Does jQuery strip some html elements from a string when using .html()?

    - by Nic Hubbard
    I have a var that contains a full html page, including the head, html, body, etc. When I pass that string into the .html() function, jQuery strips out all those elements (body, html, head, etc.), which I don't want. My data var contains:

        <html>
        <head>
        <title>Untitled Document</title>
        </head>
        <body>
        </body>
        </html>

        // data is a full html document string
        data = $('<div/>').html(data); // jQuery strips my document string!
        alert(data.find('head').html());

    I need to manipulate a full html page string so that I can return what is in the <head> element. I would like to do this with jQuery, but it seems all of the methods (append(), prepend() and html()) try to convert the string to DOM elements, which removes all the other parts of a full html page. Is there another way I could do this? I would be fine using another method. My final goal is to find certain elements inside my string, so I figured jQuery would be best, since I am so used to it. But if it is going to trim and remove parts of my string, I am going to have to look for another method. Ideas?
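
    One sketch of a workaround: since browsers won't keep html/head/body elements inside an injected <div>, pull the part you need out of the raw string first and only then hand that fragment to jQuery. The regex here assumes the document contains a single, well-formed head section:

        // Extract the <head> markup from the raw string, then let jQuery parse just that part
        var headMatch = data.match(/<head[^>]*>([\s\S]*?)<\/head>/i);
        if (headMatch) {
            var $head = $('<div/>').html(headMatch[1]);   // safe: no <html>/<body> left to strip
            alert($head.find('title').text());
        }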

    Read the article

  • Chrome is leaking memory when jQuery is used for event handling?

    - by user269386
    Hi, I'm seeing an increase in memory usage when I use jQuery event handling in Chrome. I've tested it in IE and FF as well, but there I couldn't see a suspicious rise in memory usage compared to Chrome. I'm using Chrome version 4.0.223.16 (unfortunately I'm forced to use this version here). Simple example below: just scroll with the mousewheel in the red box and open the Chrome task manager, and you will see an increase in memory which won't be released anymore:

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
        <html>
        <head>
            <script type="text/javascript" src="js/libs/jquery-1.4.2.min.js"></script>
            <script type="text/javascript">
                jQuery(function () {
                    jQuery("#div1").bind("mousewheel", function (event) {
                        event.preventDefault();
                    });
                });
            </script>
        </head>
        <body>
            <div id="div1" style="width: 100px; height: 100px; background-color: red;"></div>
        </body>
        </html>

    Has anyone experienced the same problem (or is it maybe solved in a different version of Chrome)? And does anyone have a fix for it? Thanks

    Read the article

  • How do I use a jQuery not selector to select relative URLs?

    - by Matt
    I'm working on a little jQuery script to add Google Analytics pageTracker onclick data to all relative URLs on my forum, allowing me to track clicks to external sites. I don't want to add the onclick to internal links on forum.sitename or sitename, and I don't want to add it to any hrefs marked # or that start with /. My script below works nicely, but for one minor problem! All of the forum's URLs are relative and don't start with /. I appear to have no way to change that, so I need to modify the jQuery below to stop it adding the onclick to such links, as it currently does. What I want to do is write a .not() function like .not("[href!^=http") to prevent jQuery from adding the onclick to any hrefs which do not start with http. However, .not() appears not to support this. I'm new to jQuery and can't figure this out. Any pointers would be massively appreciated.

        $(document).ready(function(){
            // Get URL from a href
            var URL = $("a").attr('href');

            // Add pageTracker data for GA tracking
            $("a")
                .not("[href^=#]")
                .not("[href^=http://forum.sitename]")
                .not("[href^=http://www.sitename]")
                .attr("onclick","pageTracker._trackEvent('Outgoing_Links', 'Forum', " + URL + ");");
        });

    Thanks!
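
    A minimal sketch of inverting the logic: select only the links whose href starts with http, then exclude the internal hosts, and bind a real click handler so each link reports its own href rather than the href of the first anchor on the page. The hostnames are kept as the placeholders used in the question:

        $(document).ready(function () {
            // Select only absolute http(s) links, then drop the internal hosts
            $('a[href^="http"]')
                .not('[href^="http://forum.sitename"]')
                .not('[href^="http://www.sitename"]')
                .click(function () {
                    // this.href is the clicked link's own URL
                    pageTracker._trackEvent('Outgoing_Links', 'Forum', this.href);
                });
        });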

    Read the article

  • Is a full html page needed when loading a page with jQuery mobile?

    - by Vincent Hiribarren
    I am currently looking at jQuery Mobile and its system of loading web pages with XmlHttpRequest. Thanks to that, it is possible to automatically perform transition animations between two pages, for instance. However, something is not clear to me. If I understand correctly, each new page of a jQuery Mobile powered website is injected into the DOM of the initial web page. The documentation of jQuery Mobile even says that because of this mechanism, the <title> tag of new webpages is not taken into account. So, in a way, if my initial webpage A.html loads a page B.html, I would tend to think that the webpage B.html does not need to have a full HTML grammar with the <html>, <head> or <body> tags. My page B.html could directly begin with a <div> element. Am I right? Is a full html page needed when loading an HTML page with jQuery Mobile? What are the pros and cons of having a webpage with a wrong/truncated HTML syntax (apart from the fact that such a page should not be accessed directly, but only through the main page)?

    Read the article

  • Best way to use Google's hosted jQuery, but fall back to my hosted library on Google fail

    - by Nosredna
    What would be a good way to attempt to load the hosted jQuery at Google (or other Google-hosted libs), but load my copy of jQuery if the Google attempt fails? I'm not saying Google is flaky; there are cases where the Google copy is blocked (apparently in Iran, for instance). Would I set up a timer and check for the jQuery object? What would be the danger of both copies coming through? I'm not really looking for answers like "just use the Google one" or "just use your own." I understand those arguments. I also understand that the user is likely to have the Google version cached. I'm thinking about fallbacks for the cloud in general. Edit (added later): Since Google suggests using google.load to load the ajax libraries, and it performs a callback when done, I'm wondering if that's the key to serializing this problem. I know it sounds a bit crazy; I'm just trying to figure out whether it can be done in a reliable way or not. Update: jQuery is now hosted on Microsoft's CDN: http://www.asp.net/ajax/cdn/
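
    A widely used sketch of a synchronous fallback: because script tags block, you can check for the jQuery object immediately after the CDN script tag and document.write a local copy if it is missing, so there is never a window where both copies load or neither does. The local path is a placeholder:

        <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js" type="text/javascript"></script>
        <script type="text/javascript">
            // If the CDN request failed, window.jQuery is still undefined at this point,
            // so fall back to a copy served from your own host.
            if (typeof jQuery === 'undefined') {
                document.write('<script src="/js/jquery-1.4.2.min.js" type="text\/javascript"><\/script>');
            }
        </script>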

    Read the article
