Search Results

Search found 3028 results on 122 pages for 'urls'.


  • passing url in parameters mvc4

    - by user516883
    I have a site that collects URLs. A full HTTP URL is entered into a textbox, and I get a 400 error when that URL is passed as the parameter; it works fine with regular text. Using jQuery, how can I pass the full URL to my application? Thanks for any help.

    MVC routing config:

        routes.MapRoute(
            "UploadLinks",
            "media/upload_links/{link}/{albumID}",
            new { controller = "Media", action = "WebLinkUpload" }
        );

    Controller action:

        public ActionResult WebLinkUpload(string link, string albumID) { }

    jQuery AJAX call:

        $('#btnUploadWebUpload').click(function () {
            $.ajax({
                type: "GET",
                url: "/media/upload_links/" + encodeURIComponent($('#txtWebUrl').val().trim()) + "/" + currentAlbumID,
                contentType: "application/json; charset=utf-8",
                dataType: "json",
                success: function (result) { }
            });
        });


  • Randomly Losing Session Variables Only In Google Chrome & URL Rewriting

    - by Toby
    Using Google Chrome, I'm seemingly losing/corrupting session data when navigating between pages (PHP 5.0.4, Apache 2.0.54). The website works perfectly fine in IE7/8, Firefox, Safari and Opera; the issue is only with Google Chrome. I narrowed down the problem: I'm using search-friendly URLs and hiding my front controller (index.php) via a .htaccess file, so the URL looks like www.domain.com/blah/blah/. Here are the .htaccess file contents:

        Options +FollowSymlinks
        RewriteEngine on
        # allow cool URLs without index.php
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*) index.php [L]

    If I remove the .htaccess file and expose the front controller in the URL (www.domain.com/index.php/blah/blah/), Chrome works perfectly fine. Any thoughts or ideas? I'm thinking it's some kind of problem with how Chrome identifies which cookie to use and send to the server? This happens in Chrome 4 and 5. Thanks!


  • Best Website Statistics tool for Drupal

    - by Olav
    What is the best free website statistics setup I can have for Drupal 6 on Apache? Particularities:

        1. It's a multisite install. I might want to look across several sites, but should be able to restrict each client's view to their own site.
        2. Some hits bypass Drupal.
        3. Some URLs are not public.
        4. Some sites have little traffic, and it would be nice to be able to exclude "own" traffic.

    Logged-in users are not so important. (It seems Google Analytics is popular.)


  • SQL SELECT with time range

    - by nLL
    Hi, I have the click_log table below, logging hits for some URLs:

        site   ip          ua   direction   hit_time
        -----------------------------------------------------
        1      127.0.0.1        1           2010/01/01 00:00:00
        2      127.0.0.1        1           2010/01/01 00:01:00
        3      127.0.0.1        0           2010/01/01 00:10:00
        ...    .........

    I want to select incoming hits (direction: 1), grouped by site, that are from the same ip and browser, logged within 10 minutes of each other, and occurred more than 4 times in those 10 minutes. I'm not sure if that was clear enough; English is not my first language, so let me try to explain with an example. If site 1 gets 5 hits from the same ip and browser within 10 minutes of the first unique hit from that ip and browser, I want it included in the selection. Basically I am trying to find abusers.
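
    A sketch of how such a query might look, assuming MySQL, a DATETIME hit_time column, and PDO on the PHP side; "within 10 minutes" is read here as a window starting at each (site, ip, ua) triple's first hit:

        <?php
        // Sketch only: find (site, ip, ua) triples with 5+ incoming hits inside
        // the 10 minutes following that triple's first hit. Connection details
        // and table layout are assumptions.
        $pdo = new PDO('mysql:host=localhost;dbname=logs;charset=utf8', 'user', 'pass');

        $sql = "
            SELECT c.site, c.ip, c.ua, COUNT(*) AS hits
            FROM click_log c
            JOIN (
                SELECT site, ip, ua, MIN(hit_time) AS first_hit
                FROM click_log
                WHERE direction = 1
                GROUP BY site, ip, ua
            ) f ON f.site = c.site AND f.ip = c.ip AND f.ua = c.ua
            WHERE c.direction = 1
              AND c.hit_time < f.first_hit + INTERVAL 10 MINUTE
            GROUP BY c.site, c.ip, c.ua
            HAVING COUNT(*) >= 5";

        foreach ($pdo->query($sql) as $row) {
            echo "site {$row['site']}: {$row['hits']} hits from {$row['ip']}\n";
        }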


  • "digg" button and encoded url :S

    - by guest86
    Hi! I wrote a PHP site (it's still a prototype) and placed a "Digg" button on it. Placing the button was easy, but the official manual says the URL has to be encoded. I did that with urlencode(). After urlencode(), my URL looks like this:

        http%3A%2F%2Fwww.mysite.com%2Fen%2Fredirect.php%3Fl%3Dhttp%3A%2F%2Fwww.othersite.rs%2FNews%2FWorld%2F227040%2FRusia-Airplane-crashed%26N%3DRusia%3A+Airplane+crashed

    So far, so good, but when I submit that URL to Digg, it is recognized as an invalid URL:

        http://www.mysite.com/en/redirect.php?l=http://www.othersite.rs/News/World/227040/Rusia-Airplane-crashed&N=Rusia:+Airplane crashed

    If I place a "+" between "Airplane" and "crashed" (near the end of the link), Digg recognizes it without any problems! Please help, this bizarre problem is killing my brain cells! P.S. For the purposes of this question the URLs are changed (nonexistent), because the originals involve non-English sites. P.P.S. Happy New Year! :)
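
    One likely culprit, sketched below: urlencode() represents spaces as "+", which becomes ambiguous once the URL is decoded and re-parsed on the other end, while rawurlencode() uses %20 and round-trips cleanly. The submit endpoint in the sketch is illustrative, not taken from Digg's documentation:

        <?php
        // Sketch: build the inner redirect URL and the submission link with
        // rawurlencode(), so spaces become %20 rather than "+".
        $story = 'http://www.othersite.rs/News/World/227040/Rusia-Airplane-crashed';
        $title = 'Rusia: Airplane crashed';

        $redirect = 'http://www.mysite.com/en/redirect.php'
                  . '?l=' . rawurlencode($story)
                  . '&N=' . rawurlencode($title);

        // Illustrative endpoint only:
        $digg = 'http://digg.com/submit?url=' . rawurlencode($redirect);
        echo $digg;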


  • How can I request local pages in the background of an ASP.NET MVC app?

    - by flipdoubt
    My ASP.NET MVC app needs to run a set of tasks at startup and in the background at a regular interval. I have implemented each task as a controller action and listed the app-relative path to each action in the database. I implemented a TaskRunner process that gets the URLs from the database and requests each one at a regular interval using WebRequest.Create, but this throws a UriFormatException. I cannot use this answer or any code that plucks values from HttpContext.Current.Request without getting an HttpException with the message "Request is not available in this context". The Request object is not available because my code uses System.Threading.Timer to do background processing, as recommended here. Here are my questions:

        1. Is there really no way to make local web requests within an ASP.NET web app?
        2. Is there really no way to dynamically ascertain the root path to the web app, even using static dependencies in ASP.NET?
        3. I was trying to avoid storing the app's root path in the database (as FogBugz does with its "Maintenance Path"), but is this the best option?


  • Can you detect a 301 redirect with Microsoft.XMLHTTP object?

    - by dmb
    I'm using VBScript and the Microsoft.XMLHTTP object to scrape some web data. I have a list of URLs to check, but unfortunately some of them 301-redirect to others on the list, so I wind up with redundant data. Is it at all possible to make the XMLHTTP object fail on a 301 redirect, or at least cache the original response header, or otherwise just let me know what happened? (Notes: I have no control over the server I'm requesting data from; when I get new data, I could check whether it's redundant, but I'd like to avoid that if possible.) Any ideas would be greatly appreciated.


  • PHP Script for redirecting to url

    - by Aruna
    Hi, I have two URLs, http://www.abc.com and http://www.xyz.com. I want to redirect to http://www.xyz.com whenever someone types http://www.abc.com in the browser. Also, when the user types something like http://www.abc.com/index.php?option=com_content&view=article&id=46&Itemid=55, i.e. any query after abc.com, I want to redirect to http://www.xyz.com/index.php?option=com_content&view=article&id=46&Itemid=55. How can I do this in PHP? Please help me.
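
    A minimal sketch of such a redirect script on abc.com, assuming every request is funneled through it; $_SERVER['REQUEST_URI'] carries the path and query string across unchanged:

        <?php
        // A minimal sketch for www.abc.com: REQUEST_URI holds everything after
        // the host (path + query string), so ?option=...&Itemid=55 survives.
        $target = 'http://www.xyz.com' . $_SERVER['REQUEST_URI'];
        header('Location: ' . $target, true, 301);
        exit;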


  • mod rewrite, title slugs and htaccess

    - by chris
    I have been brought in to provide some SEO guidance on a website which has been running since 2005. My problem is that I want to use clean URLs, but the code that handles the URL is hidden away in some class file, and with over a few thousand lines of code it's a struggle to rewrite it. So I'm thinking: I have gone through all the products and created a slug for each of them as a field in the product table. Is it possible to do something like an intermediate file for .htaccess? Something like:

        1. /clean-slug-comes-in/ arrives;
        2. .htaccess catches this and uses slug.php to find the relevant product id for the slug;
        3. then product.php?id=(the id found in step 2) is loaded?
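
    That intermediate step can be a small lookup script. A sketch under assumed names (a product table with id and slug columns; the rewrite rule it relies on is shown in the comment):

        <?php
        // slug.php: the intermediate lookup described above. Assumes an
        // .htaccess rule along these lines:
        //   RewriteCond %{REQUEST_FILENAME} !-f
        //   RewriteRule ^([a-z0-9-]+)/?$ slug.php?slug=$1 [L,QSA]
        // Table, column and file names are assumptions.
        $pdo  = new PDO('mysql:host=localhost;dbname=shop;charset=utf8', 'user', 'pass');
        $stmt = $pdo->prepare('SELECT id FROM product WHERE slug = ?');
        $stmt->execute(array(isset($_GET['slug']) ? $_GET['slug'] : ''));
        $id = $stmt->fetchColumn();

        if ($id === false) {
            header('HTTP/1.0 404 Not Found');
            exit;
        }

        // Hand off to the existing script; the clean URL stays in the address bar.
        $_GET['id'] = $id;
        require 'product.php';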


  • Caching HTML output with PHP

    - by Mohamed Amine
    Hi! I would like to create a cache for the PHP pages on my site. I have found many solutions, but what I want is a script which can generate an HTML page from my database. For example: I have a page for categories which grabs all the categories from the DB, so the script should be able to generate an HTML page of the sort my-categories.html. Then if I choose a category I should get a my-x-category.html page, and so on and so forth for other categories and subcategories. I can see that some websites have URLs like www.the-web-site.com/the-page-ex.html even though they are dynamic. Thanks a lot for the help.
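
    One common approach is output buffering with a file cache: render the page once, write the HTML to disk, and serve the file until it expires. A minimal sketch, with the cache path and lifetime as assumptions:

        <?php
        // Minimal file-cache sketch. Cache directory and TTL are assumptions.
        $cacheFile = dirname(__FILE__) . '/cache/my-categories.html';
        $ttl = 3600; // rebuild at most once an hour

        if (file_exists($cacheFile) && time() - filemtime($cacheFile) < $ttl) {
            readfile($cacheFile);             // serve the pre-built HTML
            exit;
        }

        ob_start();
        // ... build the categories page from the database as usual ...
        $html = ob_get_contents();
        ob_end_flush();                       // send the page to the client
        file_put_contents($cacheFile, $html); // store it for the next visitor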


  • Why do I get HTTP Code 414 on one network but not another?

    - by Stephen Darlington
    I have an otherwise working iPhone program. A recent change means that it generates some very long URLs (over 4000 characters sometimes) which I know isn't a great idea and I know how to fix -- that's not what I'm asking here. The curious thing is that when I make the connection using a 3G network (Vodafone UK) I get this HTTP "414 Request-URI Too Long" error but when I connect using my local WiFi connection it works just fine. Why would I get different results using different types of network? Could they be routing requests to different servers depending on where the connection originates? Or is there something else at stake here? The corollary questions relate to how common this is. Is it likely to happen whenever I use a cell network or just some networks?


  • Is there a Java equivalent for Ruby on Rails "url_for"?

    - by Oren Ben-Kiki
    I have written something like this pretty easily in C# (string GetUrl(new { controller = "foo", action = "bar", baz = "fnord" })), based on the existing capabilities of the XmlRouteCollection class provided by the ASP.NET MVC framework (why it isn't there out of the box is beyond me; the additional required code was trivial). I am now faced with a JSP project, and I need the same ability: centralizing the logic for generating all URLs in one place, based on a list of routing rules. Is there some code somewhere I could reuse or adapt to do this in Java? It seems like a common enough requirement, but Google proved surprisingly unhelpful in finding something like this.


  • Approaches to timing out sessions on a web app using AJAX autorefreshes

    - by Braintapper
    I'm writing a web application that autorefreshes data with an AJAX call at set intervals. Because of that, server-side user sessions never time out, since the last-activity time is refreshed with every AJAX call. Are there good client-side rules I could implement to time out the user? I.e. should I track mouse movements in the browser, etc., or should I point the AJAX calls at URLs that don't refresh the session? I like that my AJAX calls hit a session-enabled URL, because I can also validate that the user is logged in. Any thoughts on whether I should even bother timing out the users?
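
    One middle road, sketched here in PHP purely as an illustration (the question is framework-agnostic): keep the polling URL session-enabled so the login can still be checked, but track the last real user action yourself and never extend it from the autorefresh endpoint. All names below are assumptions.

        <?php
        // poll.php: the AJAX autorefresh hits this endpoint. It reads the
        // session but deliberately does not count as user activity.
        session_start();
        $timeout = 15 * 60; // 15 minutes of real inactivity

        $last = isset($_SESSION['last_activity']) ? $_SESSION['last_activity'] : 0;
        if (time() - $last > $timeout) {
            session_destroy();
            header('HTTP/1.1 401 Unauthorized'); // client shows the login page
            exit;
        }

        // ... return the refreshed data. Note: last_activity is NOT touched here.
        // Normal, user-triggered requests would run instead:
        //   $_SESSION['last_activity'] = time();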


  • RewriteRule in htaccess in subdirectory

    - by Jay
    Windows server, running Apache. In my Apache conf I have AllowOverride None for the root of a site, and then I have a subdirectory set to AllowOverride All:

        <Directory />
            AllowOverride None
        </Directory>
        <Directory "/safe/">
            AllowOverride All
        </Directory>

    However, when I try to set up a rewrite rule in the subdirectory's .htaccess file, nothing happens; I just get a 404 page not found error. Example:

        RewriteEngine On
        RewriteRule (.*) /blah?test=$1 [R=302,NC,NE,L]

    Rewriting URLs works fine from the root via the Apache conf. I don't understand why the rule is ignored. I don't want to do the URL rewriting within the conf because in this case I may need to change the redirects constantly and don't want to reload the server every time a change is made. I also don't want to affect server performance by enabling .htaccess files site-wide, just in the subdirectory where I need it.


  • Should I have a separate copy of all CakePHP files for every new application?

    - by BicMan
    I'm extremely new to CakePHP. From what I've gathered, it seems like I can have multiple applications that all share the same app and cake directories. So, let's say I have two applications. CakeFacebookApp and GenericCakeBlog. These applications are completely separate from each other and will have completely separate URLs, but they will reside on the same webhost. Should they both be within the same cake structure, or should they each have a full cake install in separate directories? Technically, I'm sure it will work either way, but I guess I'm looking for a best practice approach. Thanks.


  • Kohana v2 - problem with routes

    - by yoda
    Hi, I'm attempting to set some custom routes in Kohana v2. What I'm looking for is a method that allows the system to:

        1. follow a pre-defined name to its related route;
        2. redirect the non-matched URLs to another controller.

    To give you a clearer view of the problem: I want the first URL parameter to be associated with certain pages (contacts, home, services, about us, etc.), and those URLs which don't match the previous pages would be routed to a controller called products, in order to determine whether the first URL parameter is a product name. Here's a sample of what I have:

        $config['_default'] = 'home';
        $config['([a-zA-Z]+)'] = 'products/showcat/$1';

    What am I missing here? Thanks in advance!
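
    For what it's worth, a sketch of one common arrangement, under the assumption that Kohana v2 checks routes in the order they are defined (first match wins), so the named pages must be listed before the catch-all:

        <?php
        // application/config/routes.php: a sketch; page names are examples only.
        $config['_default'] = 'home';

        // Explicit pages first, so they win over the catch-all:
        $config['contacts'] = 'contacts';
        $config['services'] = 'services';
        $config['about-us'] = 'about/index';

        // Everything else falls through to the products controller, which can
        // decide whether the first segment is a product name:
        $config['([a-zA-Z-]+)'] = 'products/showcat/$1';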


  • How to store user-specific data in SharePoint

    - by Paul-Jan
    I have some user-specific data that I need to store in SharePoint and make accessible to the user through custom web parts. Let's say a list of favorite URLs. What would be the most straightforward way to store this information?

        1. Some built-in property bag for SPUser or similar that I'm not aware of?
        2. An SPList, associated through a User column?
        3. A custom database table, associated through the SPUser ID?
        4. Something else?

    Sounds like an RTFM to me, but I'm probably asking Google the wrong questions.


  • Google App Engine - SiteMap Creation for a social network

    - by spidee
    Hi all. I am creating a social tool and I want to allow search engines to pick up "public" user profiles, like Twitter and Facebook do. I have seen all the protocol info at http://www.sitemaps.org and I understand it, and how to build such a file, along with an index if I exceed the 50K limit. Where I am struggling is the concept of how I make this run. The sitemap for my general site pages is simple: I can use a tool or a script to create the file, host it, submit it, and be done. What I then need is a script that will create the sitemaps of user profiles. I assume this would be something like:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.socialsite.com/profile/spidee</loc>
            <lastmod>2010-5-12</lastmod>
            <changefreq>???</changefreq>
            <priority>???</priority>
          </url>
          <url>
            <loc>http://www.socialsite.com/profile/webbsterisback</loc>
            <lastmod>2010-5-12</lastmod>
            <changefreq>???</changefreq>
            <priority>???</priority>
          </url>
        </urlset>

    I've added some ??? because I don't know how I should set these values for my profiles, based on the following:

        • When a new profile is created it must be added to a sitemap.
        • If the profile is changed, or if "certain" properties are changed, I don't know whether I update the entry in the map or do something else (updating would be a nightmare!).
        • Some users may change their profile. In terms of relevance to the search engine, the only way a Google or Yahoo search will find the user's profile (for my requirement) would be by, for example, [user name] and [location]. So once the entry for the profile has been added to the map file, the only reason to have the search bot re-index the profile would be if the user changed their user name (which they can't), or changed their location, or set their settings so that their profile is "hidden" from search engines.

    I assume my map creation will need to be dynamic. From what I have said above, I imagine that creating a new profile, and possibly editing certain properties, could mark it as needing adding/updating in the sitemap. Assuming I will have millions of profiles added and edited, how can I manage this in a sensible manner? I know I need a script that can append URLs as each profile is created, and the script will probably be a TASK running at a set frequency; perhaps the profiles have a property like "indexed" and the TASK sets it to true when the profile is added to the map. I don't see the best way to store the map. Do I store it in the datastore, i.e.:

        model = sitemaps
        properties:
          key_name = sitemap_xml_1 (and, for my map index, sitemap_index_xml)
          mapxml   = blobstore (the raw XML or ROR map)
          full     = boolean (set true when the URL count reaches 50K; a shard will tell us)

    To make this work, my thoughts are: memcache the current sitemap structure as "sitemap_xml" and keep a shard counter of the URL count. When my task executes:

        1. Build the XML structure for, say, the first 100 URLs marked "indexed == false" (how many could you run at a time?).
        2. Test whether the current memcached sitemap is full (shard counter + 100 > 50K).
        3. a. If the map is near full, create a new map entry in the model ("sitemap_xml_2"), update the map index file (also stored in my model as "sitemap_index"), and start a new shard, or reset it.
           b. If the map is not full, grab it from memcache.
        4. Append the 100-URL XML structure.
        5. Save and memcache the map.

    I can now add a handler using a URL map/route like /sitemaps/*, take * as the map name, and serve the maps from the blobstore/memcache on the fly.

    Now my question is: does this work? Is this the right way, or at least a good way to start? Will it handle making sure the search bots update when a user changes their profile, possibly by setting the change frequency correctly? Or do I need a more advanced system? Or have I reinvented the wheel?! I hope this is all clear and makes some form of sense. :-)


  • backslashes in url variables

    - by namtax
    Hi there. I have set up my ColdFusion application to have dynamic URLs on the page, such as www.musicExplained/index.cfm/artist/:VariableName. However, my variable names will sometimes contain slashes, such as www.musicExplained/index.cfm/artist/GZA/Genius. This causes a problem, because my application presumes that the slash in the variable name represents a different section of the website: the artist's albums. So the URL will fail. I am wondering if there is any way to prevent this from happening? Do I need to use a function that replaces slashes in the variable names with another character? Thanks


  • Problem With Inserts of multibyte (converted to utf-8) strings in the mysql tables of utf_unicode_ci encoding

    - by user381595
    http://domainsoutlook.com/sandbox/keyword/?s=http://bhaskar.com is a raw example from my keyword density analyser. Every keyword shows up properly, with no problems in the Unicode conversions etc. But when I add these words to a database column, the words end up mangled: http://domainsoutlook.com/b/site/bhaskar.com.html. For example, on that front-end page there is a keyword that is shown as blank but still occurs on the website 8 times (it isn't empty in the database, though). I have checked, and there is no problem with mysql_real_escape_string: the output stays the same before and after the word goes through it. Another problem is that I want to fix my URLs for the Arabic language. They should show up as /word-{1st letter of the word}/{whole word}.html, but they show up as /word-{whole word}/{1st letter of the word}.html. I really need answers for these two questions.
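
    A common cause of exactly this kind of mangling is a connection charset that does not match the data, and mysql_real_escape_string() depends on it too. A sketch using the classic mysql_* API the post implies; database, table and column names are placeholders:

        <?php
        // Sketch: force the connection to UTF-8 before escaping and inserting.
        $conn = mysql_connect('localhost', 'user', 'pass');
        mysql_select_db('keywords_db', $conn);
        mysql_set_charset('utf8', $conn); // must match the utf8_unicode_ci table

        $word = mysql_real_escape_string($keyword, $conn); // now utf8-aware
        mysql_query("INSERT INTO keyword_stats (word, hits)
                     VALUES ('$word', " . (int) $hits . ")", $conn);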


  • Can you control pinterest's "find image" results?

    - by anthony
    Rather than adding Pin It buttons throughout our site, I would like to simply control which images show up in Pinterest's "Find Image" results when a user decides to pin one of our URLs. As of now, "Find Images" lets the user scroll through the images it finds on the page and select which one to pin. The "found" images start with the first JPG in the HTML file, I'm assuming (could that be a bad assumption?). On our site, this forces a user to scroll through about 15 navigation and promotion images before arriving at the featured product image. Is there any way to specify the image that shows first in those results, maybe through a meta tag, or by adding a class or id to the element? Without a public Pinterest API this seems like guesswork, but I wanted to see if anyone else has run into this or solved it. Thanks.


  • Session being reset when using sub-domain.

    - by Adam Witko
    Hi, I'm trying to use subdomains in my ASP.NET website, but I'm running into problems with the session being reset. I've edited my hosts file to have 'localhost', 'one.localhost' and 'two.localhost'. I can go to any of these URLs, do what I need to do, and log in to my system. The session mode is defined in the web.config; I'm using SQLServer mode, as the website will be run as a web farm. What I'm finding is that when I click something that causes a postback, all the session data is lost and a new session id is created; when this occurs the address is now 'localhost' rather than the logged-in 'one.localhost', for example. Does anyone know what might be causing this? Cheers


  • Facebook application redirect on different URL's

    - by All is well
    Hi, I have made a Facebook application for generating an access_token. It's working fine except for one problem with its redirect URL. I have set the redirect URL in my FB application to www.mysite.com/myfolder/, so it works for all folders/files inside www.mysite.com/myfolder/. But I am having a problem: I want to use these URLs for the redirect:

        http://www.mysite.com/myfolder/index.php?mod=config&do=useradd
        http://www.mysite.com/myfolder/index.php?mod=config&do=useredit

    and it always redirects me to:

        http://www.mysite.com/myfolder/index.php?mod=config

    which is wrong. Please help resolve this issue.


  • Rewrite all URL requests to https://www.example.com/$1

    - by xylar
    I have two domains, example.com and example.co.uk, that use the same application on my server. I would like to rewrite the address depending on what the user types in. The only URLs I want are https://www.example.com and https://www.example.co.uk. In my .htaccess file I have the following:

        # Turn on URL rewriting
        RewriteEngine On

        RewriteCond %{HTTP_HOST} ^example\.co.uk$ [NC]
        RewriteRule ^(.*)$ https://www.example.co.uk/$1 [L,R=301]

        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

    If I go to http://www.example.com it doesn't add the https; if I go to http://example.com it does. What is the best way of making the RewriteCond match the www URL?
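
    For what it's worth, a sketch of one way to also catch the www-over-plain-HTTP case: the two conditions above only match the bare hosts, so http://www.example.com matches neither rule. Testing the HTTPS flag covers it (this assumes SSL terminates at Apache itself, where %{HTTPS} is set):

        # Sketch: also catch www.* arriving over plain HTTP.
        RewriteCond %{HTTPS} !=on
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

        RewriteCond %{HTTPS} !=on
        RewriteCond %{HTTP_HOST} ^www\.example\.co\.uk$ [NC]
        RewriteRule ^(.*)$ https://www.example.co.uk/$1 [L,R=301]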


  • PHP: URL detection (regexp) includes line breaks

    - by marco92w
    I want to have a function which takes a text as input and returns the text with URLs turned into HTML links. My draft is as follows:

        function autoLink($text) {
            return preg_replace('/https?:\/\/[\S]+/i', '<a href="\0">\0</a>', $text);
        }

    But this doesn't work properly. For input text which contains ... http://www.google.de/ ... I get the following output:

        <a href="http://www.google.de/<br">http://www.google.de/<br</a> />

    Why does it include the line breaks? How could I limit it to the real URL? Thanks in advance!
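
    A guess at the cause, with a sketch: \S matches any non-whitespace character, including '<', so if the text has already been run through something like nl2br() the pattern swallows the adjacent "<br />" tag; it is not matching real line breaks. Excluding HTML delimiters from the character class avoids that:

        <?php
        // Sketch: stop the match at whitespace OR HTML-sensitive characters,
        // so a trailing "<br />" (e.g. from nl2br()) is not swallowed.
        function autoLink($text) {
            return preg_replace(
                '/https?:\/\/[^\s<>"\']+/i',
                '<a href="$0">$0</a>',
                $text
            );
        }

        echo autoLink('see http://www.google.de/<br />next line');
        // -> see <a href="http://www.google.de/">http://www.google.de/</a><br />next line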

