Search Results

Search found 6362 results on 255 pages for 'django urls'.


  • How do I objectively measure an application's load on a server

    - by Joe
    All, I'm not even sure where to begin looking for resources to answer my question, and I realize that speculation about this kind of thing is highly subjective. I need help determining what class of server I should purchase to host an MS Silverlight application with an MSSQL Server back-end on a Windows Server 2008 platform. It's an interactive program, so I can't simply generate a list of URLs to test against and run it with 1000 simultaneous users. What tools are out there to help me determine what kind of load the application will put on a server at varying levels of concurrent users? Would you all suggest separating the SQL server from the web server, to better differentiate the generated load on the different parts of the stack?
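
    A minimal sketch of one measurement approach, assuming a Windows box with PowerShell 3.0+: log standard performance counters on the candidate server while testers exercise the application at different user counts, then compare the logs. The counter paths are standard Windows/SQL Server counters; the interval, sample count, and output path are arbitrary placeholders.

        # Hypothetical counter-logging sketch; run while testers use the app
        $counters = @(
            '\Processor(_Total)\% Processor Time',
            '\Memory\Available MBytes',
            '\PhysicalDisk(_Total)\Avg. Disk Queue Length',
            '\SQLServer:General Statistics\User Connections'  # present when MSSQL is installed
        )
        # Sample every 5 seconds for an hour, then inspect the log in PerfMon
        Get-Counter -Counter $counters -SampleInterval 5 -MaxSamples 720 |
            Export-Counter -Path C:\perf\load-test.blg -FileFormat BLG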

  • Employee Monitoring software

    - by nute
    I am looking for an employee monitoring solution that would allow us to remotely connect to our computers to see what is happening live, preferably with some recording capabilities such as snapshots, URLs visited, etc. I've looked around the web, and most software I found was from unknown companies, had crappy websites, and made me feel like they either wanted me to install a virus on my computer or to scam me. Most also seemed to have planted "reviews" online, most likely written by themselves. Basically, does anyone have experience with a trustworthy company to accomplish that? Thanks

  • Why would image thumbnails not be showing in all search engines in all browsers?

    - by Edward Tanguay
    For over a week now, when I search for a word at Google, it correctly gives me some preview image thumbnails, but when I click on "Images", it doesn't show me any thumbnails. The same thing happens at Bing.com: when I search Bing itself, it gives me some thumbnails on the general search result page, but when I click on "Images", it doesn't show me any thumbnails. The same thing happens at Yahoo. If I click on one of the broken thumbnails, it shows me the picture fine. Thumbnails at YouTube also work fine. It seems each search engine serves its thumbnails from a different host: t1.gstatic.com, ts3.mm.bing.net, thm-a02.yimg.com. So it doesn't seem to be a problem with one specific URL not sending thumbnails; it is a problem with search engine image thumbnails in general. Also, this happens in every browser I try: Internet Explorer, Firefox, Chrome. What could be the problem? Is it my computer, some setting somewhere, my router, my Internet provider (T-Online, Germany)? Has anyone ever had this problem and solved it?
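
    One quick way to narrow this down, offered as a diagnostic sketch: since every browser on the machine is affected, check whether the thumbnail hosts resolve correctly with the default resolver versus a public one. The hostnames come from the question; 8.8.8.8 is just one example of an external DNS server.

        rem Check how the thumbnail hosts resolve with the default (router/ISP) DNS
        nslookup t1.gstatic.com
        nslookup ts3.mm.bing.net
        rem Compare against a public resolver; differing answers point at local DNS
        nslookup t1.gstatic.com 8.8.8.8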

  • rewrite rule does not rewrite URL as expected

    - by user1708687
    I have a problem with a CMS website that normally generates readable URLs. Sometimes it happens that navigation links are shown as www.domain.com/22, which results in an error, instead of www.domain.com/contact. I have not found a solution for this yet, but the page works if the URL is www.domain.com/index.php?id=22. Therefore, I'm trying to rewrite www.domain.com/22 to www.domain.com/index.php?id=22, and I have used this rewrite rule:

        RewriteRule ^([1-9][0-9]*)$ index.php?id=$1 [NC]

    I tested it using http://htaccess.madewithlove.be, and there it shows the correct result, but on the website no rewrite is happening.
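
    A common reason a rule that validates in an offline tester does nothing on the live server is that the rewrite engine is never switched on in that context, or that .htaccess files are not being read at all (AllowOverride None in the host config). A minimal .htaccess sketch for this rewrite, assuming mod_rewrite is enabled and overrides are allowed:

        RewriteEngine On
        # Only rewrite if the request doesn't match a real file or directory
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # Map a purely numeric path like /22 to index.php?id=22
        RewriteRule ^([1-9][0-9]*)$ index.php?id=$1 [L]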

  • Nginx static files exclude one or some file extensions

    - by Evgeniy
    I'm serving up a static site via nginx:

        location ~* \.(avi|bin|bmp|dmg|doc|docx|dpkg|exe|flv|gif|htm|html|ico|ics|img|jpeg|jpg|m2a|m2v|mov|mp3|mp4|mpeg|mpg|msi|pdf|pkg|png|ppt|pptx|ps|rar|rss|rtf|swf|tif|tiff|txt|wmv|xhtml|xls|xml|zip)$ {
            root /var/www/html1;
            access_log off;
            expires 1d;
        }

    My goal is to exclude requests like http://connect1.webinar.ru/converter/task/. The full form is like http://mydomain.tld/converter/task/setComplete/fid/34330/fn/7c2cfed32ec2eef6788e728fa46f7a80.ppt.swf. Although these URLs end in such a format, they are not static but fake script requests, so I have problems with them. What is the best way to do this? How can I add an exclusion for this URL, or maybe I can exclude the specific file extensions (.ppt.swf, .pptx.swf) from this nginx location? Thanks.
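
    One way to carve out an exception in nginx is a prefix location with the ^~ modifier, which takes priority over regex locations, so anything under /converter/ never reaches the static-file block. A minimal sketch; the proxy_pass target is a placeholder for however the dynamic requests are actually served:

        # ^~ prefix match wins over regex locations such as the static one,
        # so /converter/... requests skip the static-file handling entirely.
        location ^~ /converter/ {
            proxy_pass http://127.0.0.1:8080;  # placeholder backend
        }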

  • How can I configure Firefox to open links in the same window, but requests from external applications in a new one

    - by Mnementh
    I hate it when sites decide for me which links should open in a new window and which in the same one; the back button doesn't work. The good thing is, Firefox has the option browser.link.open_newwindow. If I set this to 1, all links with target="_blank" open in the same window, as it should be. But now links clicked in external programs (like the email client or newsreader) also open in the same window, destroying the already opened website. How can I configure Firefox so that links in a website always open in the same window, but URLs opened from external programs always open in a new one?
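
    Firefox has a companion preference for exactly this split: browser.link.open_newwindow.override.external applies only to links handed to Firefox by other applications, and defaults to -1 (use the main setting). A sketch of the commonly cited pair of values, worth verifying against your Firefox version; set via about:config or a user.js file:

        // in about:config (or user.js)
        user_pref("browser.link.open_newwindow", 1);                    // in-page links: current window
        user_pref("browser.link.open_newwindow.override.external", 2);  // external apps: new window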

  • How to enable customers to use their own domain for sites hosted by me [closed]

    - by Scott
    I am thinking of running a self-serve site builder, but I was wondering how I would allow customers to use their own domains that they already own. Is that even possible? Let's say my site is www.bestsitebuildingwebsite.com and each customer has URLs like this:

        www.bestsitebuildingwebsite.com/frances
        www.bestsitebuildingwebsite.com/eden
        www.bestsitebuildingwebsite.com/john

    And a customer has a domain called widgets.com. Is it actually possible for widgets.com to go to my site somehow and have HASHES in the URL still work (my site makes use of hashes for AJAX queries)? And would their site still have good SEO with Google? Thanks, Scott
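
    Custom domains are typically handled by having the customer point DNS at your server (a CNAME for www.widgets.com, or an A record for the bare domain) and mapping the Host header to the right customer site on your end. Hash fragments are never sent to the server at all, so they keep working regardless. A minimal Apache sketch under those assumptions; widgets.com and the internal path are placeholders:

        # Customer DNS (done by the customer at their registrar):
        #   www.widgets.com.  CNAME  www.bestsitebuildingwebsite.com.
        <VirtualHost *:80>
            ServerName widgets.com
            ServerAlias www.widgets.com
            # Serve this customer's site; the #hash part never reaches the server
            DocumentRoot /var/www/sites/frances
        </VirtualHost>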

  • Capture live streaming

    - by acidzombie24
    I want to capture an RTMP stream. The videos are live, different every day, and usually I can't tune in because I am busy at work doing something :(. I would like to capture the stream; however, they use anti-capturing techniques (it's live and free, so I don't understand why). I tried Orbit Downloader without any luck. The URL seems kind of weird (judging by Grab++): it has || in it and other URLs. What applications can I use to capture this? I am open to using Linux.
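
    The usual command-line tool for this on Linux is rtmpdump, which records a live RTMP stream to a file and can be scheduled from cron. A minimal sketch; the stream URL is a placeholder, and sites with anti-capture tricks may additionally require the correct --swfUrl/--pageUrl values copied from a capture tool like Grab++:

        # Record a live RTMP stream to an FLV file (URL is a placeholder)
        rtmpdump --live -r "rtmp://example.com/live/streamname" -o show.flv

        # Example crontab entry to start recording at 20:00 every weekday
        # 0 20 * * 1-5 rtmpdump --live -r "rtmp://example.com/live/streamname" -o /tmp/show-$(date +\%F).flv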

  • Remote deploying WARs to a Liferay installation

    - by iftrue
    With vanilla Tomcat, you can POST to URLs beneath SOMURL/manager/ with a proper manager user role defined. The Liferay deployment of Tomcat, however, is missing the manager and host-manager applications, and when I copy the directories from a vanilla Tomcat installation, I get the exception below:

        javax.servlet.ServletException: Error allocating a servlet instance
            org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:558)
            org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
            org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
            org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:852)
            org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
            org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
            java.lang.Thread.run(Thread.java:636)

        root cause:

        java.lang.SecurityException: Servlet of class org.apache.catalina.manager.HTMLManagerServlet is privileged and cannot be loaded by this web application
            org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:558)
            org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
            org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
            org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:852)
            org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
            org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
            java.lang.Thread.run(Thread.java:636)

    What's the proper way to remote deploy WARs to a Liferay instance? (Not portlets, in my case.)
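
    Liferay bundles ship with a hot-deploy folder: anything copied into $LIFERAY_HOME/deploy is picked up by the auto-deploy scanner and deployed into the bundled Tomcat, which gives a simple remote-deploy path over SSH without the manager app. A minimal sketch, assuming SSH access and a default bundle layout; the paths are placeholders:

        # Copy the WAR to Liferay's hot-deploy directory; the auto-deploy
        # scanner picks it up and deploys it.
        scp myapp.war user@liferay-host:/opt/liferay/deploy/

        # Watch the deployment progress in the Tomcat log
        ssh user@liferay-host 'tail -f /opt/liferay/tomcat/logs/catalina.out'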

  • Apache whitelist a single location, but require basic auth for everything else

    - by Chris Lawlor
    I'm sure this is simple, but Google is not my friend this morning. The goal is:

        /public... is openly accessible
        everything else (including /) requires basic auth

    This is a WSGI app with a single WSGI script (it's a Django site, if that matters). I have this:

        <Location /public>
            Order deny,allow
            Allow from all
        </Location>

        <Directory />
            AuthType Basic
            AuthName "My Test Server"
            AuthUserFile /path/to/.htpasswd
            Require valid-user
        </Directory>

    With this configuration, basic auth works fine, but the Location directive is totally ignored. I'm not surprised, as according to this (see How the Sections are Merged), the Directory directive is processed first. I'm sure I'm missing something, but since Directory applies to a filesystem location, and I really only have the one Directory at /, and it's a Location that I wish to allow access to, but Directory always overrides Location...

    EDIT: I'm using Apache 2.2, which doesn't support AuthType None.
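
    In Apache 2.2 the usual trick for this is Satisfy Any: keep the auth requirement global, and in the /public Location allow by host and declare that either condition (valid user or passing Allow rule) is enough. A minimal sketch along those lines:

        <Location />
            AuthType Basic
            AuthName "My Test Server"
            AuthUserFile /path/to/.htpasswd
            Require valid-user
        </Location>

        <Location /public>
            Order deny,allow
            Allow from all
            # Either a valid user OR a passing Allow rule is sufficient here,
            # so /public is open while everything else still prompts for auth.
            Satisfy Any
        </Location>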

  • Screen startup apps

    - by stillinbeta
    I know that most people don't bother with things like screen anymore, but I happen to really like it, even in this GUI day and age. I still do most of my development from a Bash prompt, so it's extremely useful to me. What I'm wondering is what the easiest way is to start an instance of screen (stored in a shell script, a .screenrc, or somewhere else) so that it starts up with set commands already running in set windows. For example, I use a Django test server, so I'd like one window to come up running "python manage.py runserver" and another blank, waiting for commands. The man page is wholly indecipherable. These old Unix utilities can do quite nearly everything, so I'm sure this is possible, but I can't for the life of me figure out how.
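
    screen reads an rc file at startup, and the screen command inside it creates a window with an optional title, window number, and command to run. A minimal sketch of a project-specific rc file, started with screen -c; the project path is a placeholder:

        # ~/myproject.screenrc - start with: screen -c ~/myproject.screenrc
        chdir /path/to/myproject

        # Window 0: the Django development server
        screen -t runserver 0 python manage.py runserver

        # Window 1: a plain shell, selected on startup
        screen -t shell 1 bash
        select 1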

  • keeping URL domain the same when pointing A record to a hosting account

    - by kwight
    Hello, I have a new WordPress website and a legacy billing system. For technical reasons, they cannot be on the same hosting plan. The hosting account for billing (and the original abc.com website) also manages DNS and mail. I'm trying to incorporate the new website under the same domain, e.g. abc.com (the website, on a different hosting account) and billing.abc.com (billing). I assume the answer is having a different A record for abc.com. I currently have a cPanel shared hosting account to use for the website (but can upgrade if necessary). How would I set this up in cPanel so that the URLs work properly? Do I need a dedicated IP and then add the domain as an add-on domain? Thanks
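
    Conceptually, the zone on the DNS-hosting account ends up split: the bare/www names point at the new host's IP, while billing and mail stay on the old server. A hypothetical zone-file sketch with placeholder (documentation-range) IPs:

        ; abc.com zone - placeholder IPs
        abc.com.          IN A     203.0.113.10   ; new WordPress hosting account
        www.abc.com.      IN A     203.0.113.10
        billing.abc.com.  IN A     198.51.100.20  ; legacy billing server
        abc.com.          IN MX 10 mail.abc.com.  ; mail stays on the old server
        mail.abc.com.     IN A     198.51.100.20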

  • PHP mail arrives at Gmail, but not at local server

    - by thomas
    The PHP mail function I am using does not work completely. It will send mail to Gmail easily enough. However, emails routed directly to my internally hosted Exchange server are not getting through. The server/domain setup is as follows:

        URLs are registered with Network Solutions (www.independentsservice.com & www.isco.net)
        Network Solutions directs all traffic to our ISP (Socket.net)
        Socket directs as follows:
            Mail to our local server
            FTP to our local server
            HTTP to our website hosted on Chihost.com

    Traffic to our local server goes through a WatchGuard firewall, which routes mail traffic to our locally hosted Exchange server. Is there some reason why Exchange won't accept these emails? Thanks!
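
    A useful first step is to take PHP out of the picture and speak SMTP directly to whatever host the domain's MX record points at; if a hand-typed session fails the same way, the problem is in DNS, the firewall, or Exchange rather than the PHP code. A sketch of that test; the mail hostname and addresses are placeholders:

        # Find out which host is supposed to accept mail for the domain
        nslookup -type=mx isco.net

        # Talk SMTP to it directly (port 25) and watch where it fails
        telnet mail.isco.net 25
        # Then, by hand:
        #   HELO test.example.com
        #   MAIL FROM:<you@example.com>
        #   RCPT TO:<someone@isco.net>
        #   DATA ... QUIT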

  • Problems getting SquirrelMail and Passenger working on Apache

    - by Kenneth
    I'm trying to set up SquirrelMail and Passenger on the same Apache server, with one URL pointing to SquirrelMail and everything else handled by Passenger. I've gotten far enough that both SquirrelMail and Passenger will run fine by themselves, but when Passenger is running it handles all URLs. So far I've tried using Alias and Redirect to point a webmail/ URL to SquirrelMail's directory, but that does not work. Here is my httpd.conf file:

        <VirtualHost *:80>
            ServerName not.my.real.server.name
            DocumentRoot /var/www/sinatra/public

            # Does not work:
            #Redirect webmail/ /usr/share/squirrelmail/
            #<Directory /usr/share/squirrelmail>
            #    Require all granted
            #</Directory>

            <Directory /var/www/sinatra/public>
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>
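
    Passenger provides a PassengerEnabled directive precisely for carving a sub-URI out of a Passenger-managed vhost: combine an Alias to SquirrelMail's directory with PassengerEnabled off for that location. A minimal sketch, assuming SquirrelMail lives at /usr/share/squirrelmail:

        <VirtualHost *:80>
            ServerName not.my.real.server.name
            DocumentRoot /var/www/sinatra/public

            Alias /webmail /usr/share/squirrelmail
            <Location /webmail>
                # Tell Passenger to leave this URL alone so Apache/PHP serves it
                PassengerEnabled off
            </Location>
            <Directory /usr/share/squirrelmail>
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>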

  • Rewrite rule Mod_Proxy truncates file name [duplicate]

    - by Valerio Cicero
    This question already has an answer here: Redirect, Change URLs or Redirect HTTP to HTTPS in Apache - Everything You Ever Wanted to Know about Mod_Rewrite Rules but Were Afraid to Ask (5 answers)

    I searched online for a solution, but found nothing :(. I wrote this simple rule:

        RewriteRule ^(.*)$ http://www.mysite.com/$1 [P,NE,QSA,L]

    In mysite.it I have an .htaccess with this rule, and it's OK, but if I have a link "http://www.mysite.it/public/file name.html", the server points to "http://www.mysite.it/public/file". I have tried many solutions but can't solve it. I tried this and many shades of it:

        RewriteRule ^(.*)(%20)(.*)$ "http://www.mysite.com/$1$3" [P,NE,QSA,L]

    Thanks!
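
    A plausible explanation, offered as an assumption to verify: with the NE (noescape) flag, the substituted URL is not re-escaped, so the literal space in "file name.html" survives into the proxied request line and everything after the space is cut off. Two commonly suggested variants, sketched below: drop NE so mod_rewrite escapes the space itself, or keep NE and escape the captured path with the B flag.

        # Variant 1: drop NE so mod_rewrite re-escapes the substituted URL
        # (the space becomes %20 before the request is proxied)
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [P,QSA,L]

        # Variant 2: keep NE but escape the backreference with the B flag
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [P,B,NE,QSA,L]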

  • How to subscribe to a YouTube feed from the Linux command line?

    - by Tim
    I want to subscribe to a YouTube channel and automatically download new videos to my Linux machine. I know I could do this e.g. with Miro, but I will not watch the videos using Miro, I want to choose the quality, and I would like to run it as a cron job. It should be able to:

        know which feed entries are new, and not download old entries
        resume (or at least redownload) failed/incomplete downloads from older sessions

    Are there any complete solutions for this? If not, it would be enough for me (maybe even preferable) to just have a command-line RSS reader that remembers which entries have already been seen and writes the new video URLs (e.g. http://www.youtube.com/watch?v=FodYFMaI4vQ&feature=youtube_gdata from http://gdata.youtube.com/feeds/api/users/tedxtalks/uploads) into a file. I could then accomplish the rest using a bash script and youtube-dl. What programs would be usable for this purpose? See the sketch below for how far youtube-dl alone can get.
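
    youtube-dl itself covers most of this list: it can take a channel/uploads URL, keep a record of already-downloaded video IDs, resume partial files, and pick a format. A minimal cron-able sketch; the channel URL and paths are placeholders, and flag names should be checked against your youtube-dl version:

        #!/bin/bash
        # --download-archive remembers finished video IDs, so only new entries download.
        # --continue resumes incomplete files from earlier runs.
        # -f picks the quality (here itag 22 = 720p MP4, falling back to 18).
        youtube-dl --download-archive "$HOME/ytdl/archive.txt" --continue -f 22/18 \
            -o "$HOME/videos/%(uploader)s-%(title)s.%(ext)s" \
            "http://www.youtube.com/user/tedxtalks/videos"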

  • Error attempting to log into Redmine through IIS 7.5 Reverse Proxy

    - by dneaster3
    I am trying to set up Redmine as a subdirectory of our department's intranet site, and also to rebrand it as "Workflow", using IIS's URL Rewrite extension. I have it "working" in that it will serve the page with all the correct rewrites in both the URL and the HTML code. However, when I try to submit a form (including logging in to Redmine), IIS gives me one of the following errors:

        Your browser sent a request that this server could not understand.

    or

        The specified CGI application encountered an error and the server terminated the process.

    Here's the setup:

        Redmine installed on a local Windows XP machine using the Bitnami all-in-one installer, which includes Apache 2, Ruby on Rails, MySQL, Redmine, and Thin
        Redmine runs locally at http://localhost/redmine
        Redmine runs over the intranet at http://146.18.236.xxx/redmine
        Windows Server + IIS 7.5 serving an ASP.NET intranet web application at mydept.mycompany.com
        IIS extensions URL Rewrite and ARR installed
        Reverse proxy settings for IIS (shown below) to serve Redmine at mydept.mycompany.com/workflow

        <rewrite>
            <rules>
                <rule name="Route requests for workflow to redmine server" stopProcessing="true">
                    <match url="^workflow/?(.*)" />
                    <conditions>
                        <add input="{CACHE_URL}" pattern="^(https?)://" />
                    </conditions>
                    <action type="Rewrite" url="{C:1}://146.18.236.xxx/redmine/{R:1}" logRewrittenUrl="true" />
                    <serverVariables>
                        <set name="HTTP_ACCEPT_ENCODING" value="" />
                        <set name="ORIGINAL_HOST" value="{HTTP_HOST}" />
                    </serverVariables>
                </rule>
            </rules>
            <outboundRules rewriteBeforeCache="true">
                <clear />
                <preConditions>
                    <preCondition name="isHTML" logicalGrouping="MatchAny">
                        <add input="{RESPONSE_CONTENT_TYPE}" pattern="^text/html" />
                        <add input="{RESPONSE_CONTENT_TYPE}" pattern="^text/plain" />
                        <add input="{RESPONSE_CONTENT_TYPE}" pattern="^application/.*xml" />
                    </preCondition>
                    <preCondition name="isRedirection">
                        <add input="{RESPONSE_STATUS}" pattern="3\d\d" />
                    </preCondition>
                </preConditions>
                <rule name="Rewrite outbound relative URLs in tags" preCondition="isHTML">
                    <match filterByTags="A, Area, Base, Form, Frame, Head, IFrame, Img, Input, Link, Script" pattern="^/redmine/(.*)" />
                    <action type="Rewrite" value="/workflow/{R:1}" />
                </rule>
                <rule name="Rewrite outbound absolute URLs in tags" preCondition="isHTML">
                    <match filterByTags="A, Area, Base, Form, Frame, Head, IFrame, Img, Input, Link, Script" pattern="^(https?)://146.18.236.xxx/redmine/(.*)" />
                    <action type="Rewrite" value="{R:1}://mydept.mycompany.com/workflow/{R:2}" />
                </rule>
                <rule name="Rewrite tags with hypenated properties missed by IIS bug" preCondition="isHTML">
                    <!-- http://forums.iis.net/t/1200916.aspx -->
                    <match filterByTags="None" customTags="" pattern="(\baction=&quot;|\bsrc=&quot;|\bhref=&quot;)/redmine/(.*?)(&quot;)" />
                    <conditions logicalGrouping="MatchAll" trackAllCaptures="true" />
                    <action type="Rewrite" value="{R:1}/workflow/{R:2}{R:3}" />
                </rule>
                <rule name="Rewrite Location Header" preCondition="isRedirection">
                    <match serverVariable="RESPONSE_LOCATION" pattern="^http://[^/]+/(.*)" />
                    <conditions>
                        <add input="{ORIGINAL_URL}" pattern=".+" />
                        <add input="{URL}" pattern="^/(workflow|redmine)/.*" />
                    </conditions>
                    <action type="Rewrite" value="http://{ORIGINAL_URL}/{C:1}/{R:1}" />
                </rule>
            </outboundRules>
        </rewrite>
        <urlCompression dynamicCompressionBeforeCache="false" />

    Any help that you can provide would be appreciated. I get the impression that I'm close and that it is just one little setting here or there, but I can't seem to make it work.

  • XPath automation software

    - by holms
    Too sad this topic was closed, but I'm kind of having the same question. I want to construct XPath expressions for common HTML blocks that appear on a page. For example: you could give two URLs to this software, containing the SAME HTML blocks (divs) but with different content in them. Given two stackoverflow.com URLs, the software could detect that the same div#id is being used again, and just give the XPaths of those HTML blocks. Of course I can find the XPaths myself; as far as I remember, Firebug makes it easy and shows the XPath of every HTML block. But this is a tedious procedure if you want to get XPaths for LOTS of HTML elements, which is why I want this kind of software to help with the routine.
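
    As a rough illustration of what such a tool would do, here is a hypothetical Python sketch using lxml: fetch two pages, find element ids present in both, and print an XPath for each shared block. The URLs and the id-based matching heuristic are made up for illustration:

        # Hypothetical sketch: find ids shared by two pages and print their XPaths
        import urllib.request
        from lxml import html

        def ids_to_paths(url):
            doc = html.fromstring(urllib.request.urlopen(url).read())
            tree = doc.getroottree()
            # Map every id on the page to an absolute XPath for its element
            return {el.get("id"): tree.getpath(el)
                    for el in doc.xpath("//*[@id]")}

        a = ids_to_paths("https://stackoverflow.com/questions/1")
        b = ids_to_paths("https://stackoverflow.com/questions/2")

        # Blocks that appear on both pages are likely shared page structure
        for shared_id in sorted(a.keys() & b.keys()):
            print(shared_id, "->", a[shared_id])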

  • Need some help with Apache .htaccess

    - by Legend
    I am trying to set up an application that was built using the Zend Framework. Let's say my subdomain is http://subdomain.domain.com and it points to http://www.domain.com/projectdir/. The structure of the project dir is the following:

        application/
            ...
        library/
            ...
        public/
            ...
            .htaccess

    The contents of the .htaccess are:

        SetEnv APPLICATION_ENV production
        RewriteEngine On
        # skip existing files and folders
        RewriteCond %{REQUEST_FILENAME} -s [OR]
        RewriteCond %{REQUEST_FILENAME} -l [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^.*$ - [NC,L]
        # send everything to index
        RewriteRule ^.*$ index.php [NC,L]

    While this works, the child objects on the page are being directed to the domain, i.e. the image URLs (and the CSS files etc.) are broken because they are being redirected to something like http://www.domain.com/images/image.png. Can someone please tell me how to fix this?
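
    Since the asset URLs are root-relative, they resolve against whatever host the visitor used, so one common fix is to point the subdomain's document root directly at the project's public/ directory rather than at projectdir/. A minimal vhost sketch under that assumption; the filesystem paths are placeholders:

        <VirtualHost *:80>
            ServerName subdomain.domain.com
            # Serve the Zend app's public/ dir directly, so /images, /css, etc.
            # resolve inside the project instead of at the main domain's root.
            DocumentRoot /var/www/domain.com/projectdir/public
            <Directory /var/www/domain.com/projectdir/public>
                AllowOverride All
            </Directory>
        </VirtualHost>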

  • Question about Domain Forwarding [beginner]

    - by Jack W-H
    Hello folks, just a quick beginner's question here. I have a web app located at domainxyz.com, and it generates short URLs for long posts automatically, so rather than visit domainxyz.com/reallylongpostnamehere I can just type domainxyz.com/a5c and be taken there automatically. However, I've bought a shorter domain name, short.com, and I want to be able to visit short.com/a5c and be redirected (or forwarded) to domainxyz.com/a5c. Or short.com/7f0 to domainxyz.com/7f0. This way, although it seems a tad illogical, it saves me setting up another hosting account on short.com to deal with the URL shortening. Is this possible? I realise you can forward domains, but can you forward domains AND forward the URL segments? Thanks! Jack
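
    If short.com can be pointed at any Apache host you control, path-preserving forwarding is a one-liner; many registrars' "domain forwarding with path" option does the same thing without any hosting at all. A minimal Apache sketch for a vhost answering for short.com:

        <VirtualHost *:80>
            ServerName short.com
            # mod_alias Redirect preserves whatever follows the slash:
            # short.com/a5c -> http://domainxyz.com/a5c
            Redirect permanent / http://domainxyz.com/
        </VirtualHost>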

  • How can I allow a Linux Subversion user to only execute svnserve?

    - by sbleon
    I've got a user that I'd like to only be able to use Subversion. We like to use svn+ssh:// URLs sometimes (for public keys and whatnot), so I need them to be able to connect over SSH and run only the svnserve command. When using an svn+ssh URL, svn SSHes in and passes the arguments "-c svnserve -t". I wrote a custom shell as follows to filter the commands that can be run. This works, but it's not passing the input to svnserve, so when I try to "svn up" I get "svn: Connection closed unexpectedly".

        #!/bin/bash
        if [ "$1" == "-c" ] && [ "$2" == "svnserve" ] && [ "$3" == "-t" ] && [ "$4" == "" ]; then
            exec svnserve -t
        else
            echo "Access denied. User may only run svnserve."
        fi
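
    One likely culprit, offered as an assumption to verify: when sshd runs the remote command through a login shell, the entire command arrives as a single argument after -c, so $2 holds the whole string "svnserve -t" rather than the word "svnserve" (leaving $3 empty and the test above never matching the real invocation). A sketch of the custom shell rewritten around that:

        #!/bin/bash
        # sshd invokes the login shell as: shell -c "svnserve -t"
        # so the full command line arrives in $2 as one string.
        if [ "$1" == "-c" ] && [ "$2" == "svnserve -t" ]; then
            exec svnserve -t
        else
            echo "Access denied. User may only run svnserve." >&2
            exit 1
        fi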

  • Creating an office network and monitoring all activity without a proxy

    - by Robert
    We are setting up our office network and would like to track all the websites visited by our employees. However, we would not like to use any proxy-based solutions; our work is highly dependent on applications in which you cannot configure a proxy. Hence, the approach we would like to follow is setting up a router inside a computer (something like this: http://www.techrepublic.com/article/configure-windows-server-2003-to-act-as-a-router/5844624). This will also allow us to attach multiple Ethernet cards and have redundancy in internet connectivity, with complete abstraction from the user about which connection is being used. But most importantly, since all the traffic will be going through this computer (configured as a router), I assume there will be a way to run packet analysis on all the requests/responses being made. For example: list all the FTP servers connected to (port 21), or give a graph of all the URLs visited per day by frequency. Is there already software which does this? Or is it possible to build something like this?
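
    On a box that sees all the traffic, this kind of reporting can be prototyped with standard capture tools before committing to a product. A rough sketch using tcpdump and tshark (the Wireshark CLI); the interface name is a placeholder, and the filter/field names are worth double-checking against your tool versions:

        # Capture FTP control connections to see which servers clients reach
        tcpdump -i eth0 -nn 'tcp dst port 21' -w ftp.pcap

        # Rank visited URLs by frequency from an HTTP capture
        tshark -r web.pcap -Y http.request -T fields \
            -e http.host -e http.request.uri | sort | uniq -c | sort -rn | head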

  • Routing to various node.js servers on same machine

    - by Dtang
    I'd like to set up multiple node.js servers on the same machine (but listening on different ports) for different projects, so I can pull any one down to edit code without affecting the others. However, I want to be able to access these web apps from a browser without typing in the port number, instead mapping different URLs to different ports: e.g. 45.23.12.01/app to 45.23.12.01:8001. I've considered using node-http-proxy for this, but it doesn't yet support SSL. My hunch is that nginx might be the most suitable. I've never set up nginx before; what configuration do I need to do? The examples of config files I've seen only deal with subdomains, which I don't have. Alternatively, is there a better (stable, hassle-free) way of hosting multiple apps under the same IP address?
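
    Path-based routing in nginx is a server block with one location per app, each proxying to a local port; SSL can then terminate in nginx rather than in node. A minimal sketch; the ports and paths are placeholders:

        server {
            listen 80;
            server_name 45.23.12.01;

            # /app goes to the node server on port 8001
            location /app/ {
                proxy_pass http://127.0.0.1:8001/;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
            }

            # a second project on its own port
            location /blog/ {
                proxy_pass http://127.0.0.2:8002/;
            }
        }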

  • Is it possible to have multiple rewrite rules that all do the same action, for an IIS 7.5 webserver?

    - by Pure.Krome
    I've got the rewrite module working great for my IIS 7.5 site. Now I wish to add a number of URLs that all go to an HTTP 410 Gone status. E.g.:

        <rule name="Old Site = image1" patternSyntax="ExactMatch" stopProcessing="true">
            <match url="image/loading_large.gif"/>
            <match url="image/aaa.gif"/>
            <match url="image/bbb.gif"/>
            <match url="image/ccc.gif"/>
            <action type="CustomResponse" statusCode="410" statusReason="Gone" statusDescription="The requested resource is no longer available" />
        </rule>

    But that's invalid: the website doesn't start, saying there's a rewrite config error. Is there another way I can do this? I don't particularly want to define a separate rule and action for each URL.
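
    A rule accepts only one match element, but the pattern can be a regular expression covering all the URLs at once. A sketch, switching patternSyntax to the default regex mode; extend the alternation as needed:

        <rule name="Old site images gone" stopProcessing="true">
            <!-- one regex matches all retired files instead of one rule each -->
            <match url="^image/(loading_large|aaa|bbb|ccc)\.gif$" />
            <action type="CustomResponse" statusCode="410" statusReason="Gone"
                    statusDescription="The requested resource is no longer available" />
        </rule>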

  • Trying to test Domain Collapsing / Consolidation validity for SEO purposes

    - by Roy Rico
    At work, we're trying to determine the effectiveness of domain collapsing for SEO purposes. Our current structure is to have multiple web apps served from different servers, such as:

        PUBLIC URLS - directly accessed by users
        www1.somecompany.com/webapp1
        www2.somecompany.com/webapp2
        www3.somecompany.com/webapp3

    I'm proposing to put an Apache proxy in front of these applications that will mask the different domains and route the requests to the proper server:

        PUBLIC URL                      routed/forwarded to    PRIVATE URL
        www.somecompany.com/webapp1     <----->                www1.somecompany.com/webapp1
        www.somecompany.com/webapp2     <----->                www2.somecompany.com/webapp2
        www.somecompany.com/webapp3     <----->                www3.somecompany.com/webapp3

    In terms of SEO/page rank value, does this help?
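
    For reference, the proxy half of this is straightforward in Apache with mod_proxy; a minimal sketch of the consolidation vhost (ProxyPassReverse keeps redirects from the back-end servers pointing at the public name):

        <VirtualHost *:80>
            ServerName www.somecompany.com
            ProxyPass        /webapp1 http://www1.somecompany.com/webapp1
            ProxyPassReverse /webapp1 http://www1.somecompany.com/webapp1
            ProxyPass        /webapp2 http://www2.somecompany.com/webapp2
            ProxyPassReverse /webapp2 http://www2.somecompany.com/webapp2
            ProxyPass        /webapp3 http://www3.somecompany.com/webapp3
            ProxyPassReverse /webapp3 http://www3.somecompany.com/webapp3
        </VirtualHost>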
