Search Results

Search found 23346 results on 934 pages for 'clean url'.

  • jQuery ajax post dynamic url insertion

    - by Kirill
    $.ajax({ type: "POST", url: "OMFG.php", data: info, success: function(){} }); is what I'm using at the moment as a test, and it works fine. I need to get the URL from the link I'm clicking, so I do var url = $(this).attr("href");, which works fine when I alert it (the link includes http://samedomain.com/etc.php), but the ajax call doesn't post when I insert the variable into the code: $.ajax({ type: "POST", url: url, data: info, success: function(){} }); Please help; I'm stuck without this working.
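
    A likely cause is that the click handler never prevents the browser's default navigation, so the page unloads before the POST completes. A minimal sketch of the usual pattern (the 'a.post-link' selector is an assumption for illustration; info is the payload from the question):

        $('a.post-link').click(function (e) {
            e.preventDefault();                 // keep the browser from following the link
            var url = $(this).attr('href');     // capture the href while 'this' is the link
            $.ajax({
                type: 'POST',
                url: url,
                data: info,
                success: function (response) {
                    // handle the server response here
                }
            });
        });

    If the href ever points at a different domain, the browser's same-origin policy will also block the XMLHttpRequest, regardless of how the URL is passed.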

  • Clean checkout from TFS 2008

    - by Luis Medel
    Hi all, I want to pass a project to a colleague without SCC bindings, to avoid accidental changes in my repository. Is it possible to do a clean checkout from a TFS 2008 repo? I'm going crazy trying to find such an option in VS2008. Thanks in advance.
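
    One approach (a sketch only; the workspace name, server URL, and server path below are placeholders) is to fetch the sources with the tf command-line client and then delete the binding files before handing the tree over:

        rem create a throwaway workspace and get the project
        tf workspace /new TempExport /server:http://tfsserver:8080
        tf get $/MyProject /recursive

        rem strip the source-control binding files
        del /s *.vssscc
        del /s *.vspscc

    The .sln file itself also carries a source-control GlobalSection, which Visual Studio's File > Source Control > Change Source Control > Unbind command removes.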

  • Clean up upon the kill signal

    - by Begui
    How do you handle clean-up when the program receives a kill signal? For instance, there is an application I connect to that wants any third-party app (my app) to send a finish command. What is the best way to send that finish command when my app has been destroyed with a kill -9?
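
    Worth noting: SIGKILL (kill -9) can never be caught, blocked, or ignored, so no in-process clean-up is possible for it; the realistic options are to handle SIGTERM/SIGINT instead and to make the peer tolerate abrupt disconnects. A minimal C sketch (real handlers should normally just set a flag, since functions like fprintf are not async-signal-safe):

        #include <signal.h>
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>

        static void on_term(int sig)
        {
            /* send the finish command / release resources here */
            fprintf(stderr, "caught signal %d, cleaning up\n", sig);
            exit(0);
        }

        int main(void)
        {
            struct sigaction sa;
            sa.sa_handler = on_term;
            sigemptyset(&sa.sa_mask);
            sa.sa_flags = 0;
            sigaction(SIGTERM, &sa, NULL);  /* plain kill <pid> sends SIGTERM */
            sigaction(SIGINT,  &sa, NULL);  /* Ctrl-C */
            /* SIGKILL cannot be handled at all */
            for (;;) pause();               /* placeholder main loop */
        }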

  • Creating Application URL for iPhone

    - by pion
    I am preparing to submit my iPhone app for approval. This is my first time. One of the requirements is an "Application URL". I have done the following to create it: click Foo-Info.plist, right-click "Information Property List", click "Add Row", select "URL types", expand "URL Types", expand "Item 0", and type "com.mycompany.Foo" into the Value field of the "URL identifier" key. I am wondering if I have done this correctly. Thanks in advance for your help.
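
    For reference, those Xcode labels map to raw Info.plist keys: "URL types" is CFBundleURLTypes and "URL identifier" is CFBundleURLName. A sketch of the resulting XML (the "foo" scheme entry is an assumption, only needed if the app should be launchable via a custom foo:// URL; the "Application URL" asked for at submission time is usually just a plain support/marketing web address, which is separate from this):

        <key>CFBundleURLTypes</key>
        <array>
            <dict>
                <key>CFBundleURLName</key>
                <string>com.mycompany.Foo</string>
                <key>CFBundleURLSchemes</key>
                <array>
                    <string>foo</string>
                </array>
            </dict>
        </array>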

  • How can I make clean search urls?

    - by newbie
    If a search has a lot of different options, the URL becomes very long and looks very bad. Is there any way to make the URLs look better? Using POST for the search would keep the URLs clean, but then people couldn't share search URLs.
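
    A common compromise is to stay with GET (so results remain shareable) but serialize only the parameters that differ from their defaults. A small JavaScript sketch (the field names are invented for illustration):

        // build a query string that skips empty/default values
        var params = { q: 'shoes', color: '', minPrice: 0, sort: '' };
        var parts = [];
        for (var key in params) {
            if (params.hasOwnProperty(key) && params[key]) {
                parts.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
            }
        }
        var url = '/search' + (parts.length ? '?' + parts.join('&') : '');
        // -> "/search?q=shoes"

    URL rewriting (e.g. mod_rewrite) can then shorten /search?q=shoes to /search/shoes if desired.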

  • How to get a Clean String in Javascript?

    - by streetparade
    I have a long string with some German characters and lots of newlines, tabs, etc. In a select box the user can select a text; on change I do document.getElementById('text').value=this.value; but this fails -- I just get an "unterminated string literal" error in JavaScript. I think I should clean the string. How can I do it in JavaScript?
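
    An "unterminated string literal" almost always means the text is being interpolated server-side into an inline script or attribute with raw line breaks inside the quotes; a runtime assignment from this.value cannot cause a parse error by itself. The clean-up therefore has to happen wherever the string is turned into JavaScript source. A sketch of such an escaper (shown in JavaScript, but the same substitutions apply in whatever language emits the page):

        // escape characters that would terminate or break a quoted JS literal
        function escapeForJs(s) {
            return s.replace(/\\/g, '\\\\')
                    .replace(/'/g, "\\'")
                    .replace(/\r/g, '\\r')
                    .replace(/\n/g, '\\n')
                    .replace(/\t/g, '\\t');
        }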

  • JSLint reports unexpected use of '&' and '|' -- I'd like to clean this

    - by Zhami
    I'm trying to get my JavaScript code 100% JSLint clean. I've got some JS code that I've lifted from elsewhere to create a UUID. That code has the following line: s[16] = hexDigits.substr((s[16] & 0x3) | 0x8, 1); This line causes JSLint to generate two error messages: 1) Unexpected use of '&' 2) Unexpected use of '|' I don't understand why; I'd appreciate advice on how to recode this to eliminate the errors.
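
    JSLint rejects bitwise operators by default on the theory that & and | are usually typos for && and ||. Since the UUID code uses them deliberately, the usual fix is to tolerate them for the file rather than rewrite the math (at the time of this question the directive below enabled bitwise operators; the option's semantics have shifted between JSLint versions, so check the docs for yours):

        /*jslint bitwise: true */
        s[16] = hexDigits.substr((s[16] & 0x3) | 0x8, 1);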

  • UIWebView comparing current and defined URL's with a loop depending on result

    - by Syleron
    I am trying to compare the current URL in webView with a defined URL, say google.com. So, in theory, NSURLRequest *currentRequest = [webView request]; NSURL *currentURL = [currentRequest URL]; would give us the current URL, and NSString *newurl = @"http://www.google.com"; would give us the defined URL to compare against: while (!currentURL == newurl) { //do whatever here because currentURL does not equal the newurl } This does not seem to work, though. Solutions?
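
    Two separate problems here: !currentURL == newurl negates the pointer before comparing, and == between Objective-C objects compares pointers, not contents (besides, currentURL is an NSURL while newurl is an NSString). A sketch of a content-level check:

        NSURL *currentURL = [[webView request] URL];
        NSString *newurl = @"http://www.google.com";

        if (![[currentURL absoluteString] isEqualToString:newurl]) {
            // currentURL does not equal newurl; do whatever here
        }

    In practice this belongs in a UIWebViewDelegate callback such as webViewDidFinishLoad: rather than a while loop, which would block the main thread.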

  • url validation in ruby on rails

    - by jpallavi
    The URL field should also accept a URL such as "www.abc.com". If the user enters a URL like this, it should automatically be prefixed with "http://", so the value saved in the database is "http://www.abc.com". If the user enters the URL as "http://www.xyz.com", the system should not prepend "http://". The user should also be able to save a URL with "https://". What is the code for this in Ruby on Rails?
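
    A common pattern is to normalize in a before_validation callback. A sketch against a hypothetical model with a url column:

        class Site < ActiveRecord::Base
          before_validation :normalize_url

          validates_format_of :url, :with => %r{\Ahttps?://}i

          private

          def normalize_url
            return if url.blank?
            # prepend http:// only when no scheme is present; http/https pass through
            self.url = "http://#{url}" unless url =~ %r{\Ahttps?://}i
          end
        end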

  • php - clean URL

    - by tibin mathew
    Hi, I want to create a web site with pure PHP, and I want to hide the URL parameters, i.e. I want to build my web site with clean URLs. Is there any way to do this without using any framework? Is cURL helpful for this? Can anyone give me a solution?
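
    cURL is for making HTTP requests from PHP and has nothing to do with clean URLs; outside a framework this is normally done with Apache's mod_rewrite plus a small front controller. A minimal sketch (the /article/42 route is illustrative):

        # .htaccess -- send everything that is not a real file or directory to index.php
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^ index.php [L]

        <?php
        // index.php -- map /article/42 onto ordinary parameters
        $path = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
        $segments = explode('/', $path);       // e.g. array('article', '42')
        if ($segments[0] === 'article' && isset($segments[1])) {
            $id = (int) $segments[1];
            // render the article...
        }
        ?>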

  • "Share on LinkedIn" widget chokes on encoded spaces in url param

    - by David Droddy
    Does anyone know why I am not able to include my own URL-encoded URL params with URL-encoded spaces? See the URL on my jsBin page, constructed from LinkedIn's example -- I have added (%3FnestedParam%3Done%20space) at the end of the "URL" value. Then, if you remove the encoded space (%3FnestedParam%3DoneSpace), it works fine. Try it out: http://jsbin.com/acosa3/3 Thanks!
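
    A plausible explanation: when a URL travels as a parameter value and the widget decodes it once more, everything in it (including the %20) needs a second round of encoding, so the space should arrive as %2520. A JavaScript sketch (the page address is a placeholder):

        var target = 'http://example.com/page?nestedParam=one space';
        var once = encodeURIComponent(target);   // space -> %20
        var twice = encodeURIComponent(once);    // space -> %2520

    Whether single or double encoding is needed depends on how many times the widget decodes the value, so it may take one experiment with each.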

  • A simple, clean web layout

    - by Shaun_web
    Ok, so I hate CSS/HTML graphic design... What do you recommend as a sample template or website that you think has a great CSS and HTML layout? From my past experience it's best to start from a page that has a white background and little dependency on graphics -- that's clean and easy to modify ;-).

  • Extracting numbers from a url using javascript?

    - by stormist
    var exampleURL = '/example/url/345234/test/'; var numbersOnly = [?] The /url/ and /test/ portions of the path will always be the same. Note that I need the numbers between /url/ and /test/. In the example URL above, the placeholder word 'example' might be numbers too from time to time, but in that case it shouldn't be matched: only the numbers between /url/ and /test/. Thanks!
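
    A sketch that anchors the match on the fixed /url/ and /test/ segments, so a numeric first segment is never picked up:

        var exampleURL = '/example/url/345234/test/';
        // capture only the digits sitting between /url/ and /test/
        var match = exampleURL.match(/\/url\/(\d+)\/test\//);
        var numbersOnly = match ? match[1] : null;   // -> '345234'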

  • Get current URL in Python

    - by Alex
    How would I get the current URL with Python? I need to grab the current URL so I can check it for query strings, e.g. requested_url = "URL_HERE" url = urlparse(requested_url) if url[4]: params = dict([part.split('=') for part in url[4].split('&')]) Also, this is running on Google App Engine.
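
    On App Engine's webapp framework the request object already exposes the current URL, so the urlparse step can start from there. A sketch (the handler name is illustrative):

        from urlparse import urlparse
        from google.appengine.ext import webapp

        class MainHandler(webapp.RequestHandler):
            def get(self):
                requested_url = self.request.url      # full current URL
                query = urlparse(requested_url)[4]    # the query-string component
                params = {}
                if query:
                    params = dict(part.split('=') for part in query.split('&'))
                # for a single parameter, self.request.get('name') is simpler
                self.response.out.write(repr(params))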

  • cURL works but PHP cURL fails to reach the Internet

    - by wrk2bike
    Trying to diagnose an issue using PHP to cURL to an Internet location on a Red Hat Linux server. cURL is installed and working, and <?php var_dump(curl_version()); ?> shows all the correct information in the output. The issue is that I can use PHP to cURL to localhost on the box itself, but not to the Internet (see below). Normally I'd suspect the firewall, but I can cURL from the command line to the Internet without a problem. The box can also update its own software packages, etc. What am I missing? My test is:

        <?php
        function http_head_curl($url, $timeout = 30)
        {
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $url);
            curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
            curl_setopt($ch, CURLOPT_HEADER, 1);
            curl_setopt($ch, CURLOPT_NOBODY, 1);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            $res = curl_exec($ch);
            if ($res === false) {
                throw new RuntimeException("cURL exception: ".curl_errno($ch).": ".curl_error($ch));
            }
            return trim($res);
        }

        // Succeeds, displaying headers
        echo(http_head_curl('localhost'));

        // Fails:
        echo(http_head_curl('www.google.com'));
        ?>
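
    When command-line cURL works but PHP under Apache cannot reach the network on a Red Hat system, SELinux is a frequent culprit: by default the httpd domain is denied outbound connections even though shell processes are not. A sketch of the check and the fix (run as root):

        # is SELinux enforcing, and may httpd open network connections?
        getenforce
        getsebool httpd_can_network_connect

        # allow outbound connections from httpd; -P persists across reboots
        setsebool -P httpd_can_network_connect 1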

  • Duplicate content issue after URL-change with 301-redirects

    - by David
    We have the following problem: we changed all URLs on our page from oldURL.html to newURL.html and set up 301-redirects (ca. 600 URLs). Google re-crawled our page and indexed all the new URLs (newURL.html), but didn't crawl the old URLs (oldURL.html) again, as there were no internal links pointing at them anymore after the URL change. This resulted in massive ranking drops, etc., because (i) Google thought oldURL.html had exactly the same content as newURL.html, causing duplicate-content issues, and (ii) Google did not transfer the juice from oldURL to newURL, because the 301-redirect was never noticed. We have now reset all internal links to the old URLs again, which then redirect to the new URLs, in the hope that Google will re-crawl the pages once there are internal links pointing at them. This is partially happening, but at a really low speed, so it would take multiple months to notice all the redirects -- I guess because Google thinks: "Aah, I already know oldURL.html, so no need to re-crawl it." Possible solutions we thought of are: submitting as many of the old URLs to the index as possible via Webmaster Tools, to manually trigger a crawl (we are doing that already), or submitting a sitemap with all the old URLs -- but we are not sure if that is a good idea, because Google does not seem to like 301-redirects in a sitemap. Neither solution is perfect, and we cannot wait three months just to regain our old rankings. What are your ideas? Best, David
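
    For the sitemap route, a minimal file that lists only the legacy addresses (placeholder URLs shown) is still valid XML; search engines generally treat listed URLs as crawl hints, which is exactly what is needed here to get the 301s noticed:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
            <url><loc>http://www.example.com/oldURL.html</loc></url>
            <!-- one <url> entry per legacy address -->
        </urlset>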

  • Apache: Virtual Host and .htaccess for URL Rewriting not working

    - by parth
    I have configured a virtual host on my local machine and everything is working fine. Now I want to use SEO-friendly URLs. To achieve this I have used an .htaccess file. My virtual host configuration is:

        <VirtualHost *:80>
            DocumentRoot "C:/xampp/htdocs/ypp"
            ServerName ypp.com
            ServerAlias www.ypp.com
            ##ErrorLog "logs/dummy-host2.localhost-error.log"
            ##CustomLog "logs/dummy-host2.localhost-access.log" combined
        </VirtualHost>

    and my .htaccess file has:

        AllowOverride All
        RewriteEngine On
        RewriteBase /ypp/
        RewriteRule ^/browse$ /browse.php
        RewriteRule ^/browse/([a-z]+)$ /browse.php?cat=$1
        RewriteRule ^/browse/([a-z]+)/([a-z]+)$ /browse.php?cat=$1&subcat=$2

    The above .htaccess setting is not working. After that I modified my virtual host setting, and with the rules moved there it works:

        <VirtualHost *:80>
            RewriteEngine On
            RewriteRule ^/browse$ /browse.php
            RewriteRule ^/browse/([a-z]+)$ /browse.php?cat=$1
            RewriteRule ^/browse/([a-z]+)/([a-z]+)$ /browse.php?cat=$1&subcat=$2
            ServerAdmin [email protected]
            DocumentRoot "C:/xampp/htdocs/ypp"
            ServerName ypp.com
            ServerAlias www.ypp.com
            ##ErrorLog "logs/dummy-host2.localhost-error.log"
            ##CustomLog "logs/dummy-host2.localhost-access.log" combined
            <Directory "C:/xampp/htdocs/ypp">
                AllowOverride All
            </Directory>
        </VirtualHost>

    Please let me know where I am going wrong in the .htaccess file, since I do not want to keep the rules in the virtual host: for every change I have to restart Apache.
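
    The likely issue: in per-directory context (.htaccess), mod_rewrite strips the directory prefix, including the leading slash, before matching, so patterns like ^/browse$ can never match; additionally, AllowOverride is only valid in the server configuration, not inside .htaccess itself. A sketch of an .htaccess that should mirror the working vhost rules (assuming the site is served from the vhost document root, so RewriteBase is /):

        RewriteEngine On
        RewriteBase /

        # per-directory patterns are matched without the leading slash
        RewriteRule ^browse$ /browse.php [L]
        RewriteRule ^browse/([a-z]+)$ /browse.php?cat=$1 [L]
        RewriteRule ^browse/([a-z]+)/([a-z]+)$ /browse.php?cat=$1&subcat=$2 [L]

    The AllowOverride All inside the vhost's <Directory> block is still required for the .htaccess to be read at all.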

  • URL length and content optimised for SEO

    - by Brendan Vogt
    I have done some reading on what URLs should look like for search engine optimisation, but I am curious to know how mine should look; I need some advice. I have a tutorial website, and my categories are something like: Web Development -> Client Side -> JavaScript. So if I have a tutorial called "What is JavaScript?", is it good to have a URL that looks something like: www.MyWebsite.com/web-development/client-side/javascript/what-is-javascipt Or would something like this be more appropriate: www.MyWebsite.com/tutorials/what-is-javascipt Just curious, because I also read that it is wise to have keywords in your URLs. Do I need to add the identifiers of each category in the link as well, something like: www.MyWebsite.com/1/web-development/5/client-side/15/javascript/100/what-is-javascipt where 1 is the unique identifier (primary key) of the category web development, 5 of the category client side, 15 of the category javascript, and 100 of the tutorial what is javascript. UPDATE: This is not a programming question, so can someone please help migrate it to the correct Q&A site without downvoting my question?
