Search Results

Search found 3028 results on 122 pages for 'urls'.

  • Webtrends REST API - Example?

    - by Asap
    Hi! I have to build a custom dashboard that presents some data from a Webtrends account. Can I get this information via the API?
    1) Views and visitors for all pages, and views/visitors for a single page (within a date range).
    2) Stored pages: a list of all URLs saved for my site in Webtrends, so I can choose which one to get more information on in point 1.
    3) The same as point 1, but covering everything from the start of the website until now.
    Thank you!

    Read the article

  • How to send HTML as a GET request parameter?

    - by Mork0075
    I would like to send an HTML string with a GET request like this, using Apache's HttpClient: http://sample.com/?html=<html><head>... This doesn't work at the moment; I think it's an encoding problem. Do you have any ideas how to do that?

        method.setQueryString(new NameValuePair[] {new NameValuePair("report", "<html>....")});
        client.executeMethod(method);

    This fails with org.apache.commons.httpclient.NoHttpResponseException: The server localhost failed to respond. If I replace "<html>" with "test.." it works fine. EDIT: It seems to be a problem of URL length after encoding; the server doesn't accept such long URLs. Sending it as a POST solves the problem.
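
    A minimal sketch of the POST workaround mentioned in the edit, assuming Commons HttpClient 3.x; the endpoint URL is a placeholder and the parameter name "report" is taken from the snippet above.

        import org.apache.commons.httpclient.HttpClient;
        import org.apache.commons.httpclient.methods.PostMethod;

        public class ReportSender {
            public static void main(String[] args) throws Exception {
                HttpClient client = new HttpClient();
                // Body parameters are form-encoded, so URL length limits no longer apply.
                PostMethod post = new PostMethod("http://sample.com/report");
                post.addParameter("report", "<html><head>...</head></html>");
                try {
                    int status = client.executeMethod(post);
                    System.out.println("HTTP status: " + status);
                } finally {
                    post.releaseConnection();
                }
            }
        }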

    Read the article

  • .htaccess for multiple applications in Kohana v3

    - by khairil
    Hi, I have set up multiple applications in Kohana v3. Everything works normally without .htaccess enabled (i.e. without removing index.php or admin.php). My setup:

        + system/
        + modules/
        + applications/
          + public/
            + bootstrap.php
            + ...
          + admin/
            + bootstrap.php
            + ...
        + index.php (for the 'public' application)
        + admin.php (for the 'admin' application)

    So a sample frontend URL looks like http://mydomain.com/index.php/(controller_name)/... and the administration site is reached at http://mydomain.com/admin.php/(controller_name)/... The task: I want to remove index.php (the default URL) and replace admin.php with /admin/ using .htaccess (mod_rewrite), so the URLs become http://mydomain.com/(controller_name) for the 'public' application and http://mydomain.com/admin/(controller_name) for the 'admin' application. My current .htaccess (not working) is:

        # Turn on URL rewriting
        RewriteEngine On

        # Installation directory
        RewriteBase /ko3/

        # Protect hidden files from being viewed
        Order Deny,Allow
        Deny From All

        # Protect application and system files from being viewed
        RewriteRule ^(?:web-apps|modules|core)\b.* index.php/$0 [L]

        # Allow any files or directories that exist to be displayed directly
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d

        # TODO: rewrite admin URL to admin.php/URL
        #RewriteRule ^admin/(.*) admin.php/$0 [L]

        # Rewrite all other URLs to index.php/URL
        RewriteRule .* index.php/$0 [PT]
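
    A hedged sketch of the admin rule hinted at by the TODO, assuming it sits above the catch-all and that the conditions are repeated for it (RewriteCond lines only apply to the rule that immediately follows them):

        # Route /admin and /admin/... to admin.php
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^admin(?:/(.*))?$ admin.php/$1 [L]

        # Everything else still goes to index.php
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule .* index.php/$0 [PT]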

    Read the article

  • One codebase on dev and production servers - how to deal with links?

    - by Yegor
    I have a copy of the code running on the prod server, and I use my local machine(s) running XAMPP as a dev server. I have several websites that I actively develop, so I'm forced to use http://localhost/sitename. All my URLs are relative to the domain (/file.php). They work fine on the prod server, but on the local server they all point to localhost, when I want them to work relative to the site folder they are in. Is there anything I could do, other than what I do now, which is this:

        if($_SERVER['SERVER_NAME'] == "localhost") {
            $path_to = "http://" . $_SERVER['SERVER_NAME'] . "/folder";
            $path_to_files = $_SERVER['DOCUMENT_ROOT'] . "/folder";
        } else {
            $path_to = "http://" . $_SERVER['SERVER_NAME'];
            $path_to_files = $_SERVER['DOCUMENT_ROOT'];
        }

    and simply putting $path_to before each link on the site.

    Read the article

  • Disallow certain URLs in robots.txt

    - by chrism
    We implemented a rating system on a site a while back that involves a link to a script. However, with the vast majority of ratings on the site at 3/5, and the ratings spread very evenly across 1-5, we're beginning to suspect that search engine crawlers etc. are getting through. The URLs used look like this: http://www.thesite.com/path/to/the/page/rate?uid=abcdefghijk&value=3 When we started we added the following to our robots.txt:

        User-agent: *
        Disallow: /rate

    Is this incorrect, or are Googlebot and others simply ignoring our robots.txt?
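
    A hedged sketch of a broader rule: Disallow: /rate only blocks paths that begin with /rate at the site root, while the rating links above live deeper in the tree. Googlebot and most major crawlers honour the * wildcard, so something like this should cover them (less capable bots may still get through):

        User-agent: *
        Disallow: /*/rate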

    Read the article

  • Using PHP to determine if a remote file has been replaced?

    - by Rob
    I have a MySQL database with some URLs in it, one URL per row. Each URL has my script on it. What I want to do is check, via a PHP script, whether the file is still there. Not check if it 404'd, but rather check whether it has been modified or replaced. Is this possible? If so, how would it be accomplished? I was thinking of making the remote file echo some string and having the local file check the page for that string, but that seems a little inefficient and sloppy.
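
    A minimal sketch of the comparison idea discussed above, assuming the expected hash (or the last-seen Last-Modified value) is stored in the same database row as the URL; function and file names are made up.

        <?php
        // Returns true if the remote file's contents differ from the stored hash,
        // false if unchanged, and null if the URL could not be fetched at all.
        function remote_file_changed($url, $expectedHash) {
            $body = @file_get_contents($url);
            if ($body === false) {
                return null;
            }
            return md5($body) !== $expectedHash;
        }

        // A lighter-weight variant, if the remote server sends Last-Modified/ETag headers:
        $headers = @get_headers('http://example.com/remote-script.php', 1);
        $lastModified = is_array($headers) && isset($headers['Last-Modified'])
            ? $headers['Last-Modified'] : null;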

    Read the article

  • Is there any way to generate a set of JWebUnit tests from an apache rewrite config?

    - by robbbbbb
    Seems unlikely, but is there any way to generate a set of unit tests for the following rewrite rule?

        RewriteRule ^/(user|group|country)/([a-z]+)/(photos|videos)$ http://whatever?type=$1&entity=$2&resource=$3

    From this I'd like to generate a set of URLs of the form:

        /user/foo/photos
        /user/bar/photos
        /group/baz/videos
        /country/bar/photos
        etc...

    The reason I don't want to just do this once by hand is that I'd like the bounded alternation groups (e.g. (user|group|country)) to be able to grow and maintain coverage without having to update the tests by hand. Is there a rewrite rule or regex parser that might be able to do this, or am I doing it by hand?
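
    Not JWebUnit itself, but a rough sketch of generating the URL list from the rule: pull out the bounded alternation groups and emit one path per combination, with a placeholder for the ([a-z]+) segment that cannot be enumerated. The class name and the "foo" placeholder are made up.

        import java.util.ArrayList;
        import java.util.List;
        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class RewriteUrlExpander {
            public static void main(String[] args) {
                String rule = "^/(user|group|country)/([a-z]+)/(photos|videos)$";

                // Capture only groups that actually contain alternatives, e.g. (a|b|c).
                Matcher m = Pattern.compile("\\(([^)|]+(?:\\|[^)|]+)+)\\)").matcher(rule);
                List<String[]> groups = new ArrayList<String[]>();
                while (m.find()) {
                    groups.add(m.group(1).split("\\|"));
                }

                for (String type : groups.get(0)) {
                    for (String resource : groups.get(1)) {
                        System.out.println("/" + type + "/foo/" + resource);
                    }
                }
            }
        }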

    Read the article

  • Using mod_rewrite in the httpd.conf file

    - by Mike Lovely
    I want to rewrite URLs so that when a user goes to http://www.example.com/applications/newWeb/www/index.php?page=48&thiscontent=2660&date=2013-10-11&pubType=0&PublishTime=09:30:00&from=home&tabOption=1 and the URL contains thiscontent=2660 (which, in the example above, it does), I redirect them to http://www.example.com/index.php/publications/finance-and-economics/departmental-resources. I have about 30 different thiscontent=XXXX values and imagine I'll have to copy and edit this rule 30 times for any links to my old website still knocking around out there. I have access to my httpd.conf file but have never written a mod_rewrite rule before. I also don't really need these showing up in the error logs as 301s. Will that happen? Because at the moment there are hundreds!
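
    A hedged sketch of one such rule, assuming mod_rewrite inside the relevant virtual host in httpd.conf: the query string isn't visible to RewriteRule, so the thiscontent value has to be matched with a RewriteCond, and the trailing ? on the target drops the old query string. (Redirects show up in the access log, not the error log.)

        RewriteEngine On

        RewriteCond %{QUERY_STRING} (^|&)thiscontent=2660(&|$)
        RewriteRule ^/applications/newWeb/www/index\.php$ http://www.example.com/index.php/publications/finance-and-economics/departmental-resources? [R=301,L]

    With around 30 mappings, a RewriteMap (a plain text file keyed on the thiscontent value) would avoid maintaining 30 copies of this pair.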

    Read the article

  • Removing exception

    - by Nikhil K
    I have used this code to extract URLs from a web page, but on the 'foreach' line it throws an "Object reference not set to an instance of an object" exception. What is the problem, and how can I correct it?

        WebClient client = new WebClient();
        string url = "http://www.google.co.in/search?hl=en&q=java&start=10&sa=N";
        string source = client.DownloadString(url);
        HtmlDocument doc = new HtmlDocument();
        doc.LoadHtml(source);
        foreach (HtmlNode link in doc.DocumentNode.SelectNodes("//a[@href and @rel='nofollow']"))
        {
            Console.WriteLine(link.Attributes["href"].Value);
        }
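
    A hedged guess at the cause, with a sketch of the guard: HtmlAgilityPack's SelectNodes returns null rather than an empty collection when the XPath matches nothing, so the foreach line dereferences null whenever the page contains no matching anchors.

        using System;
        using System.Net;
        using HtmlAgilityPack;

        class LinkExtractor
        {
            static void Main()
            {
                var client = new WebClient();
                string source = client.DownloadString(
                    "http://www.google.co.in/search?hl=en&q=java&start=10&sa=N");

                var doc = new HtmlDocument();
                doc.LoadHtml(source);

                // Null when nothing matches - this is what raised the exception.
                var links = doc.DocumentNode.SelectNodes("//a[@href and @rel='nofollow']");
                if (links == null)
                {
                    Console.WriteLine("No matching links found.");
                    return;
                }
                foreach (HtmlNode link in links)
                {
                    Console.WriteLine(link.Attributes["href"].Value);
                }
            }
        }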

    Read the article

  • Redirect issues

    - by pinniger
    Whoever wrote the navigation for the site I'm currently working on (classic ASP) points the navigation links at a folder, and inside that folder there is an index.asp file, so the URLs look something like this: www.mysite.com/myfolder/mysubfolder. Now, when I watch the page load using HttpFox, I notice that the first entry is a 302 redirect to the same address with a "/" on the end, so www.mysite.com/myfolder gets redirected to www.mysite.com/myfolder/ (note the / on the end). I'm not too worried that it's a 302, since it's in the admin section of the site, but when I forward the host headers from ISA Server for an HTTPS request, it gets redirected from https://www.mysite.com/myfolder to http://www.mysite.com:443/myfolder/ and causes all kinds of problems. Anyway, I can't seem to find any code making this redirect happen, so does IIS 6 do this because the URL points to a folder? Or do I need to comb through the code more closely?

    Read the article

  • Replace Emails and HREFS with enclosing HREFS in C#

    - by Nissan Fan
    I have an email body that used to be plain text, but now I've made it HTML. The emails are generated using a number of methods and none of them are easy to convert. What I have is: Some content [email protected], some http://www.somewebsite/someurl.aspx. What I'd like to do is create a function that automatically encloses all email addresses and all URLs within a string in HREF tags, so that the HTML email reads properly in all email clients. Does anyone have a function for this?
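
    A hedged sketch of such a function (the patterns are deliberately simple and will not cover every legal URL or address): two Regex.Replace passes, URLs first so the mailto pass never touches them.

        using System.Text.RegularExpressions;

        static class BodyLinkifier
        {
            public static string Linkify(string body)
            {
                // Bare http/https URLs -> anchor tags.
                body = Regex.Replace(body,
                    @"(https?://[^\s<]+)",
                    "<a href=\"$1\">$1</a>");

                // E-mail addresses -> mailto links.
                body = Regex.Replace(body,
                    @"([\w.+-]+@[\w-]+\.[\w.-]+)",
                    "<a href=\"mailto:$1\">$1</a>");

                return body;
            }
        }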

    Read the article

  • Ensuring uniqueness on a varchar greater than 255 in MySQL/InnoDB

    - by Vijay Boyapati
    I have a table which contains HTML entries for news pages. When I initially designed it I used the URL as the primary key. I've learned the error of my ways because left-joining on it is super slow. So I want to redesign the table with an integer (id) primary key, but still keep the rows unique based on the URL. The problem is that I've found URLs longer than 255 characters, and MySQL isn't letting me create a key on the URL column. I'm using an InnoDB/UTF-8 table. From what I understand, it's using multiple bytes per character, with a limit of 767 bytes per key in InnoDB. I would really love suggestions on an elegant way of keeping the rows unique based on URL while using an integer primary key. Thanks!
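
    A hedged sketch of the usual workaround (table and column names made up): keep the integer primary key and the full URL, and enforce uniqueness on a fixed-length hash of the URL instead of the URL itself, since the hash always fits within the index limit.

        ALTER TABLE news_pages ADD COLUMN url_hash CHAR(40) NOT NULL DEFAULT '';

        -- Populate it for existing rows; keep setting url_hash = SHA1(url) on every insert.
        UPDATE news_pages SET url_hash = SHA1(url);

        ALTER TABLE news_pages ADD UNIQUE KEY uniq_url_hash (url_hash);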

    Read the article

  • .htaccess Windows problem

    - by pistacchio
    Hi, in the root directory of a small site I'm developing I have the following .htaccess file:

        Options +FollowSymLinks
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-F
        RewriteRule .$ index.php

    It basically allows me to have pretty URLs, as it remaps every path that does not exist on the server (e.g. /user/details/145) to a call to index.php, where I handle it MVC-style. While this works fine on Mac OS X, this morning I had to work on another machine running Windows (the Apache server is run by XAMPP) and it does not work: it seems to redirect all requests (including those for static files like images) to index.php. Any help?
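
    A hedged guess at a fix: -F tests existence via an internal subrequest and can behave differently from a plain file check, so the more common front-controller pattern below (lowercase -f plus a directory test) is worth trying; whether this is really what differs between the two machines is an assumption.

        Options +FollowSymLinks
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . index.php [L]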

    Read the article

  • Placing a .htaccess file in the /var/www folder messes my website up

    - by Camran
    I am playing with mod_rewrite now and have successfully enabled it. However, I need to put a .htaccess file inside /var/www/ in order to achieve what I want, which is simply to rewrite URLs. When I place it there my website becomes very strange and basically nothing works. Is there any code I need to put into the .htaccess file in order for things to act normally? Here is the .htaccess file I have so far:

        Options +FollowSymLinks
        Options +Indexes
        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/ad\.php
        RewriteRule ^(.*)$ ad.php?ad_id=$1 [L]

    My DocumentRoot is also set to /var/www/ and my entire website root is there (index.html etc.). What am I missing about the .htaccess? If you need more input let me know.
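
    A hedged sketch of the usual adjustment: the catch-all rule above also rewrites requests for real files and directories (index.html, CSS, images), so excluding anything that exists on disk normally restores the rest of the site.

        Options +FollowSymLinks
        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/ad\.php
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ ad.php?ad_id=$1 [L]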

    Read the article

  • How to use window.location.replace in JavaScript?

    - by william
    My URLs:

        http://www.mysite.com/folder1/page1.aspx
        http://www.mysite.com/folder1/page1.aspx?id=1
        http://www.mysite.com/folder1/page1.aspx?id=1&dt=20111128

    Redirecting page: http://www.mysite.com/folder1/page2.aspx

    I want to redirect from page1.aspx to page2.aspx. How do I write the JavaScript in page1.aspx?

        window.location.replace("/page2.aspx");
        window.location.replace("../page2.aspx");
        window.location.replace("~/page2.aspx");

    The first two gave me http://www.mysite.com/page2.aspx. The last one gave me http://www.mysite.com/folder1/~/page2.aspx. What is the correct way to use it?
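
    A short sketch of the resolution rules at play: "~/" is an ASP.NET server-side convention that the browser treats as a literal folder name, a leading "/" resolves against the site root, and a bare relative path resolves against the current folder.

        // Running in /folder1/page1.aspx:
        window.location.replace("/folder1/page2.aspx"); // root-relative -> /folder1/page2.aspx
        window.location.replace("page2.aspx");          // relative to current folder -> /folder1/page2.aspx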

    Read the article

  • Apache mod_rewrite help with Wordpress

    - by protohominid
    I administer my wife's site, namelymarly.com. Up until last week, the root page of the blog was namelymarly.com/blog/. Last week I changed it in the WP settings to be namelymarly.com. WP created the new .htaccess file, and I moved index.php to the root directory (but left the WP folder where it was, in the /blog/ directory), as instructed. Everything is working great except for one very important thing: when you type 'namelymarly.com/blog/' into a browser now, you get a 404 error. All other URLs that include '/blog/somethinghere' redirect properly to '/somethinghere'. It's only when there's nothing after '/blog/' that there's a problem. I tried adding this rule but it breaks the site:

        RewriteRule ^/blog(/|)$ /

    Any suggestions/help?
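
    A hedged guess at a working form of that rule: in a per-directory .htaccess the pattern never sees a leading slash, and a redirect flag is needed. Placed above the standard WordPress block in the root .htaccess, something like this is the usual approach (whether it fits this exact setup is an assumption):

        RewriteEngine On
        RewriteRule ^blog/?$ / [R=301,L]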

    Read the article

  • Why is Drupal writing to root and not sites/default/files?

    - by Candland
    I'm using Drupal 6.14 on Windows 7. Everything seems to work except that files which should be written to sites/default/files are being written to / instead. The site was moved from a Linux installation, which writes the files correctly. I have set up a web.config with the rewrite rules for Drupal. Not sure what or where else I should check. Thanks for any help.

        <rule name="Drupal Clean URLs" stopProcessing="true">
          <match url="^(.*)$" />
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
          </conditions>
          <action type="Rewrite" url="index.php?q={R:1}" appendQueryString="true" />
        </rule>

    Read the article

  • Would popup blockers stop a URL that pops up only when the user clicks on something?

    - by tomeaton
    I'm currently building a web application that can track a user's actions on a particular website and pop up a URL if the user takes certain actions, such as: a first click, responding to a question by clicking yes/no, clicking a submit button, or exiting the site. It is important that these URLs are served to the user and are not blocked by pop-up blockers. It is my understanding that there are certain exceptions within the major browsers that allow pop-ups if they are triggered by a user action rather than served unsolicited. Is this true? How do I design this web application so that it can serve these pops (and not have them blocked)?
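
    A hedged sketch of the pattern blockers generally tolerate: the window.open call runs synchronously inside the user's own click handler rather than from a timer or onload. Behaviour still varies by browser and settings, so a plain fallback link is worth keeping; the element id and URL here are made up.

        document.getElementById("answer-yes").addEventListener("click", function () {
            // Opened directly from the user gesture, so most blockers allow it.
            window.open("http://example.com/followup", "_blank");
        });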

    Read the article

  • Replace non-html links with <A> tags

    - by tombazza
    I have a block of code that will take a block of text like the following: "Sample text sample text http://www.google.com sample text" and, using the preg_replace_callback method and the following regular expression:

        preg_replace_callback('/http:\/\/([,\%\w.\-_\/\?\=\+\&\~\#\$]+)/', create_function(
            '$matches',
            '$url = $matches[1];
             $anchorText = ( strlen($url) > 35 ? substr($url, 0, 35).\'...\' : $url);
             return \'<a href="http://\'. $url .\'">\'. $anchorText .\'</a>\';'), $str);

    will convert the sample text to look like: Sample text sample text <a href="http://www.google.com">http://www.google.com</a> sample text. My problem now is that we have introduced a rich text editor that can create links before the text is sent to the script. I need to update this piece of code so that it will ignore any URLs that are already inside an <a> tag.
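
    A hedged sketch of one way to do it (the helper name is made up): split the markup on existing <a>...</a> elements first, and run the URL replacement only on the pieces that are not already anchors.

        <?php
        function linkify_outside_anchors($html) {
            // The capturing group keeps the existing anchors in the result array.
            $parts = preg_split('/(<a\b[^>]*>.*?<\/a>)/is', $html, -1, PREG_SPLIT_DELIM_CAPTURE);
            foreach ($parts as $i => $part) {
                if (preg_match('/^<a\b/i', $part)) {
                    continue; // already a link - leave it untouched
                }
                $parts[$i] = preg_replace_callback(
                    '/http:\/\/([,\%\w.\-_\/\?\=\+\&\~\#\$]+)/',
                    function ($matches) {
                        $url = $matches[1];
                        $anchorText = strlen($url) > 35 ? substr($url, 0, 35) . '...' : $url;
                        return '<a href="http://' . $url . '">' . $anchorText . '</a>';
                    },
                    $part
                );
            }
            return implode('', $parts);
        }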

    Read the article

  • How do I create a no-javascript url hash handler for my website?

    - by Kenny Bones
    OK, I'm not sure how this is normally done, but I've got a script that basically empties a div of content and then loads content from a div on a separate web page, without reloading the current page. This works great. It's actually taken from this example, from Net Tuts (great site, btw): http://nettuts.s3.amazonaws.com/011_jQuerySite/sample/index.html The guy who wrote it even thought about handling the URLs, since the URL doesn't change when using his method. So he wrote a JavaScript snippet that looks up the URL and loads the content accordingly. Which is not working, btw. But I was thinking about people who don't have JavaScript enabled, or iPhone and iPad users ;) Copying URLs and sending them to a friend won't work at all. So how is this typically done? And can it be done without JavaScript? Possibly with PHP?
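
    A hedged sketch of the PHP fallback being asked about (file names and the ?page= parameter are made up): every link carries a real URL that PHP can serve on its own, and the JavaScript enhancement simply intercepts those same links to load the fragment without a full reload.

        <?php
        // index.php - whitelist of loadable fragments keyed by the ?page= parameter.
        $pages = array(
            'home'    => 'partials/home.html',
            'about'   => 'partials/about.html',
            'contact' => 'partials/contact.html',
        );

        $page = isset($_GET['page']) ? $_GET['page'] : 'home';
        if (!isset($pages[$page])) {
            $page = 'home'; // unknown value - never include arbitrary files
        }
        ?>
        <div id="content">
            <?php include $pages[$page]; ?>
        </div>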

    Read the article

  • Case sensitivity with the users controller on certain hosting

    - by Leo
    We generally use two different hosting services. On one, everything works tickety-boo, as it does on my local dev servers. On the other server, however, I am having this problem: I can't access the users controller like this: http://www.example.com/users/login But I can like this: http://www.example.com/Users/login (note the capitalised 'Users'). If I move the application to a sub-folder, everything works fine (both upper- and lowercase). The hosting company have looked at it and can't see a problem at their end, and they assure me that 'users' is not a reserved word. You might say this isn't a problem: just use the version that works. Unfortunately it leads to problems downstream, where the Cake core starts generating URLs itself. Has anybody else seen this problem or know the solution? (This only occurs with the users controller; all others work as expected.)

    Read the article

  • Drupal: does removing these lines from .htaccess cause security issues?

    - by Patrick
    Hi, I had to comment out these lines in the .htaccess files in my main Drupal folder and in the sites folder:

        # Don't show directory listings for URLs which map to a directory.
        #Options -Indexes

        # Follow symbolic links in this directory.
        #Options +FollowSymLinks

    ...in order not to get a 500 Internal Server Error on the new server. Can I leave them commented out, or am I going to have security issues? P.S. I've also set all content in the files folder to 777 permissions. Is this OK? Thanks.
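
    A hedged note on the likely cause rather than the workaround: a 500 from those two lines usually means the server's AllowOverride setting doesn't permit Options in .htaccess, so if the virtual host is under your control, allowing it lets the stock Drupal file stay intact (the directory path here is made up).

        <Directory /var/www/example>
            AllowOverride Options FileInfo Indexes Limit
        </Directory>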

    Read the article

  • help with php MVC problem

    - by aprencai
    Hello, I have groups on my site and the URLs have the following hierarchy:

        /groups/{id_group}/
        /groups/{id_group}/news
        /groups/{id_group}/gallery
        /groups/{id_group}/events
        /groups/{id_group}/events/{id_event}
        /groups/{id_group}/events/{id_event}/news
        /groups/{id_group}/events/{id_event}/gallery
        /groups/{id_group}/events/{id_event}/news

    As you can see, a group can have news, a gallery, etc., and in turn an event within a group can also have news, a gallery, etc. How would I implement this approach in a framework, without tying the answer to any specific one? I.e. I would like some guidance on what the modules, controllers, etc. would be. Thanks.

    Read the article

  • How do I quiet image_submit_tag from params hash?

    - by Alan S
    Does anyone know how to eliminate the x and y params when you use image_submit_tag with a GET method? I have a simple search form and use GET to pass the value in the URL. When I use image_submit_tag, it also appends the x and y coordinates, so I get URLs like http://example.com?q=somesearchterm&x=15&y=12 When I have used submit_tag, I can use the :name => nil attribute (it was in one of Ryan Bates' Railscasts), but it doesn't seem to work for image_submit_tag. Granted, it doesn't affect functionality, but I don't need them and would like them quieted.

    Read the article
