Search Results

Search found 4795 results on 192 pages for 'indian websites'.


  • Performance Monitor in IIS 7 to monitor which website is using the most resources (ASP.NET)

    - by Karl Cassar
    I am using Windows Server 2008 R2 and IIS 7.5, and am hosting multiple websites on the same web server. Is it possible to use Performance Monitor to find out, on average, which website is using the most resources? I've added a user-defined Data Collector Set in Performance Monitor, collecting data for one day. However, I could not find any details that hint at which website is using the most resources. Which counters are crucial for monitoring websites? The generated report tells me that the top process is w3wp##1 - how can I know which website it corresponds to? I've also tried adding counters for ASP.NET Applications for all object instances; however, % Managed Processor Time (estimated) is 0 at all times.
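
    One way to map a w3wp instance back to its site, assuming the stock IIS 7 appcmd tool at its default path, is to list the running worker processes together with their application pools:

        %windir%\system32\inetsrv\appcmd list wps

    Each output line pairs a process ID with an application pool name (e.g. WP "3184" (applicationPool:MySitePool)); adding the Process\ID Process counter for each w3wp instance in Performance Monitor then lets you match w3wp##1 to the right pool, and hence to its site.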

    Read the article

  • Where can I learn various hacking techniques on the web?

    - by Carson Myers
    I would like to try my hand at hacking -- that is, exploiting various website vulnerabilities. Not for any illegal purpose, mind you, but so I can have a better understanding and appreciation of these exploits while writing my own web software. I seem to recall there was a community that hosted a bunch of demo websites, and you had to find and exploit certain vulnerabilities in each one. I can't remember what it is called, but this is the sort of thing I am looking for -- I have read a tonne of little XSS and CSRF examples but have yet to find a real-life, hands-on example of one. Does anyone know of such a place, where I can be given an example page and look for security holes? I would really rather not try this with actual websites; I don't want to break any laws.
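
    For a sense of what such exercises test, here is a minimal sketch of the kind of reflected XSS hole those demo sites contain -- a hypothetical PHP search page, not taken from any particular training site -- and the usual fix:

        <?php
        // Vulnerable: user input is echoed straight into the page, so
        // ?q=<script>alert(1)</script> executes in the visitor's browser.
        echo "You searched for: " . $_GET['q'];

        // Fixed: encode the input so the browser renders it as text, not markup.
        echo "You searched for: " . htmlspecialchars($_GET['q'], ENT_QUOTES, 'UTF-8');
        ?>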

    Read the article

  • Why would a server not send a SYN/ACK packet in response to a SYN packet

    - by codemonkey
    Lately, we've become aware of a TCP connection issue that is mostly limited to Mac and Linux users who browse our websites. From the user's perspective, it presents itself as a really long connection time to our websites (11 seconds). We've managed to track down the technical signature of this problem, but can't figure out why it is happening or how to fix it. Basically, the client's machine sends the SYN packet to establish the TCP connection and the web server receives it, but does not respond with the SYN/ACK packet. After the client has sent many SYN packets, the server finally responds with a SYN/ACK packet and everything is fine for the remainder of the connection. And, of course, the kicker: it is intermittent and does not happen all the time (though it does happen 10-30% of the time). We are using Fedora 12 Linux as the OS and Nginx as the web server.
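
    Two common culprits on Linux for silently ignored SYNs are a full listen backlog and the net.ipv4.tcp_tw_recycle setting, which drops SYNs from clients behind NAT when TCP timestamps are in use -- a pattern that fits "mostly Mac and Linux users, intermittently". A quick diagnostic sketch with standard tools:

        # Are SYNs being dropped because the accept queue is full?
        netstat -s | grep -i "listen"

        # Inspect the sysctls most often involved in dropped/ignored SYNs
        sysctl net.ipv4.tcp_tw_recycle net.ipv4.tcp_timestamps net.ipv4.tcp_syncookies

        # If tcp_tw_recycle is 1, disabling it is the usual first step
        sysctl -w net.ipv4.tcp_tw_recycle=0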

    Read the article

  • Malicious program changing my DNS servers

    - by julio.alegria
    Some weeks ago I started having problems with my internet connection. It was extremely slow, and suddenly some websites (specifically Gmail, Facebook, YouTube and Twitter) started failing to connect, while the rest connected normally. A few days later, those same websites started showing me a message in Portuguese, "Nova atualização disponível" ("New update available"), whenever I tried to connect, and a .exe file started downloading ("internet_update.exe" or something like that). That's when I freaked out! It was definitely a virus or something like that, but it was really weird because I had never had a problem like that (I run Linux). So I turned on my old PC (running Windows XP) and it turned out it had the same problem! The same message was shown whenever I tried to connect to one of those specific websites, while the rest loaded without problems. Even on my Android smartphone the same message was shown. So it was obvious that the problem was not in a particular machine but in the router itself. I started googling and found some information, unfortunately only in Spanish, so I will give you a short summary: it is a new banking trojan developed specifically to infect and collect information from Brazilian banks. Apparently it has now expanded to Argentina and Peru. So how does it work? It spreads through social networks (videos, links, ...) and then it "takes control" of your internet connection by changing the values of your DNS servers. More specifically, it changes the primary DNS to one of these IPs: 108.170.13.38, 66.7.216.122 or 63.143.43.154, and the secondary DNS to 8.8.8.8. That secondary DNS is actually the Google Public DNS, and it is configured this way so that your internet connection continues working properly and the user does not notice anything. The important part is that because nothing has been downloaded or installed on your machine, no antivirus will notice any change. After your DNS servers have been changed, the trojan controls every single website you connect to, and this way they steal your bank information. So after reading about this I accessed my router and restored my primary and secondary DNS to their proper values, but a day later I had the same problem again. This is actually a 50% warning post - 50% help me! post. So, here comes the question: is there any possible way to prevent my DNS settings from being changed?
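
    One quick way to check whether the router is still hijacked, assuming the dig utility is installed and using www.google.com purely as a test name, is to compare the answer from the router's DNS with one from a resolver you choose explicitly:

        # Answer via the router's configured (default) resolver
        dig +short www.google.com

        # Answer via Google's public resolver, bypassing the router's setting
        dig +short www.google.com @8.8.8.8

    If the answers keep differing for the affected sites, the router is still resolving through the rogue server. Since this trojan reconfigures the router through its web interface, changing the router's admin password and disabling remote (WAN-side) administration are the usual first defenses.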

    Read the article

  • PNG files are not being displayed in IE and Vista's Sidebar

    - by Wbdvlpr
    Hi, I am using Windows Vista. I was just visiting a website and found some "missing image" boxes on the webpages, and I could see this on many of the websites I visited. Then I realised that only a certain type of image file is not being displayed: PNG. I restarted my computer and noticed that two of the Sidebar gadgets were missing background images. The websites with "missing" images work fine in Firefox, though, so it's a problem with IE and some of the Windows files. Any ideas how I can get PNGs working in IE and the Sidebar? Thanks.

    Read the article

  • Internet connectivity issues with one router but works OK with another router

    - by user825904
    I have a TP-Link ADSL router, and when I enter my username and password on its setup page, everything works fine. I also have a Netgear router; when I enter the same username and password, the internet works OK for about 50% of websites, but for the other 50% the page never loads and it just hangs. The status bar says "website found, waiting for reply" and no site is displayed. I wonder which setting is different on these two routers. I bought the TP-Link router from a local shop, but the Netgear router is from a different country. Could that make a difference?
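
    Some sites loading while others hang is a classic MTU/fragmentation symptom, often fixed by lowering the router's MTU for PPPoE (1492 or below). A quick probe from a Windows machine, using www.google.com merely as a reachable test host:

        rem 1472 bytes of payload + 28 bytes of headers = a 1500-byte packet
        ping -f -l 1472 www.google.com

    If that reports "Packet needs to be fragmented but DF set", step the size down (1464, 1452, ...) until the ping succeeds, then set the Netgear's MTU to that payload size plus 28.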

    Read the article

  • If I use OpenVPN, can VPN servers monitor my usernames/passwords?

    - by Duff
    OpenVPN uses a type of encryption similar to SSL. This seems to suggest that even if I choose an incredibly shady VPN server, my content will be secure. That is, the VPN server will be able to monitor which websites I patronize, but not the actual data I transfer. That said, I am not an expert at this type of thing, so I wanted to make sure that I understand correctly. Is it true that if I use OpenVPN my usernames/passwords are secure, even if the VPN is untrustworthy? If not, why? (And how, if at all, can it be fixed?) Things I don't know much about that may (or may not! I honestly don't know) be related to my question: DNS leakage, IPv6, tracking cookies, browser plugins, and websites that don't support HTTPS.

    Read the article

  • Tomcat 5.5 - multiple contexts using same path

    - by ctn8iv
    Is it possible to set up multiple contexts using the same path? For example:

        <Context docBase="/www/websites/site1/java/base" path="/base" reloadable="true"/>
        <Context docBase="/www/websites/site2/java/base" path="/base" reloadable="true"/>

    I have two sites that use the same path, both running on the same server/IP. The sites use different virtual hosts and different ServerNames, but I have no control over the directory structure of the sites because they are maintained by a client. Until now, they have been content with only allowing one site to run at a time, but this is a major hassle, so I need to know if there's a workaround.
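
    Since the two sites already answer to different hostnames, one approach is to give each its own <Host> element in server.xml and nest the Context inside it, so the identical paths no longer collide. A sketch, with the hostnames assumed for illustration:

        <Host name="site1.example.com" appBase="/www/websites/site1/java">
            <Context docBase="/www/websites/site1/java/base" path="/base" reloadable="true"/>
        </Host>
        <Host name="site2.example.com" appBase="/www/websites/site2/java">
            <Context docBase="/www/websites/site2/java/base" path="/base" reloadable="true"/>
        </Host>

    Tomcat selects the Host by the request's Host header, so each /base resolves against its own docBase.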

    Read the article

  • IIS 7.5 401 - Unauthorized Access on a Virtual Directory

    - by Jimmy
    I have set up a website in IIS 7.5 on a Windows 2008 machine. The website sits in C:\websites\. Then I added a virtual directory called "/uploads" that points to D:\websites\uploads. This directory holds all the images/media. When I browse the website in a browser, I don't see any images. When I browse to an image directly, I notice that it throws a 401 error: 401 - Unauthorized: Access is denied due to invalid credentials. I have searched Google quite a lot and I am pretty sure I have all the permissions set up correctly. Can anyone tell me what I could be doing wrong here?
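
    Because the virtual directory lives on a different drive, a common cause is that the site's identity has no NTFS read access there. One hedged first step is granting the built-in IIS_IUSRS group read/execute on the folder tree (path taken from the question):

        icacls "D:\websites\uploads" /grant "IIS_IUSRS:(OI)(CI)RX"

    If anonymous authentication runs as the IUSR account, granting IUSR the same rights may also be necessary.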

    Read the article

  • Is it possible to host a website in the 'ether' of the Internet -- not on a server -- so that it cannot be taken down? [closed]

    - by Christopher Altman
    This is a theoretical problem I am curious about. Websites are hosted on servers. Servers can be taken offline. Is it possible to host a website in the 'ether' of the Internet -- not on a server -- so that it cannot be taken down? One example is that the website is hosted on other websites, like a parasite. Another is that it is assembled from pieces stored on DNS machines, routers, etc., so that it gets assembled on the fly. The purpose is that this website could live forever because no one person can remove it. The answers I am looking for are plausible ideas/approaches for how this could technically be built.

    Read the article

  • How do I repair the .NET Framework on Windows 7

    - by Micah
    So I ran Spyware Doctor and it found a bunch of malware. I clicked "Remove" but failed to create a restore point first. Now my websites running on .NET 2.0, as well as Visual Studio 2008, are not working. My websites running .NET 4 and Visual Studio 2010 are working just fine. I'm assuming I need to restore .NET 2.0/3.5 or something. Any idea how to do this? Thanks!
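
    On Windows 7, .NET 3.5 (which includes the 2.0 runtime) ships as an operating-system component rather than a standalone install, so a commonly suggested first step is letting System File Checker restore whatever the malware cleanup removed. From an elevated command prompt:

        sfc /scannow

    If that doesn't help, toggling ".NET Framework 3.5.1" off and back on under Control Panel > Programs > Turn Windows features on or off forces Windows to re-stage the component.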

    Read the article

  • What config files need to be transferred while migrating Apache vhosts from an old SUSE server to a new SUSE server?

    - by jarus
    I have an old server running SUSE that hosts numerous websites under the same IP. Now I am trying to migrate the websites and all the contents of the old SUSE server to a new server with openSUSE 12.1. I have transferred "/srv/www/vhosts", "/etc/apache2/vhosts.d", "/etc/apache2/httpd.conf", "/etc/apache2/listen.conf" and "/etc/apache2/default-server.conf", and I have transferred all the database files as well. I am trying to replace the old server with the new server; I tried changing the IP address to the old server's IP address, but it's not working. What files do I need to transfer, and what do I need to do, to get the new server hosting the websites in place of the old one? Please, any help will be greatly appreciated.
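
    Beyond the Apache files already copied, the pieces that most often break a like-for-like swap on SUSE are the sysconfig Apache flags (which list the enabled modules), any SSL material the vhosts reference, and the hosts file. A hedged checklist, assuming rsync and the usual SUSE locations:

        # Module list and server flags SUSE keeps outside the apache2 conf tree
        rsync -a /etc/sysconfig/apache2 newserver:/etc/sysconfig/apache2

        # SSL certificates and keys referenced by the vhosts (if any)
        rsync -a /etc/apache2/ssl.crt/ newserver:/etc/apache2/ssl.crt/
        rsync -a /etc/apache2/ssl.key/ newserver:/etc/apache2/ssl.key/

        # Hostname/IP mappings some vhosts or scripts may rely on
        rsync -a /etc/hosts newserver:/etc/hosts

    After copying, apache2ctl configtest on the new box will point out anything the vhosts still reference that didn't make the trip.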

    Read the article

  • Adding to database with multiple text boxes

    - by kira423
    What I am trying to do with this script is allow users to update a URL for each of their websites, and since each user isn't going to have the same number of websites, it is hard for me to just add $_POST['website'] for each of them. Here is the script:

        <?php
        include("config.php");
        include("header.php");
        include("functions.php");

        if(!isset($_SESSION['username']) && !isset($_SESSION['password'])){
            header("Location: pubs.php");
        }

        $getmember = mysql_query("SELECT * FROM `publishers` WHERE username = '".$_SESSION['username']."'");
        $info = mysql_fetch_array($getmember);
        $getsites = mysql_query("SELECT * FROM `websites` WHERE publisher = '".$info['username']."'");

        $postback = $_POST['website'];
        $webname = $_POST['webid'];

        if($_POST['submit']){
            var_dump($_POST['website']);
            $update = mysql_query("UPDATE `websites` SET `postback` = '$postback' WHERE name = '$webname'");
        }

        print"
        <div id='center'>
        <span id='tools_lander'><a href='export.php'>Export Campaigns</a></span>
        <div id='calendar_holder'>
        <h3>Please define a postback for each of your websites below. The following variables should be used when creating your postback.<br />
        cid = Campaign ID<br />
        sid = Sub ID<br />
        rate = Campaign Rate<br />
        status = Status of Lead. 1 means payable 2 mean reversed<br />
        A sample postback URL would be <br />
        http://www.example.com/postback.php?cid=#cid&sid=#sid&rate=#rate&status=#status</h3>
        <table class='balances' align='center'>
        <form method='POST' action=''>";

        while($website = mysql_fetch_array($getsites)){
            print"
            <tr>
            <input type='hidden' name='webid' value='".$website['id']."' />
            <td style='font-weight:bold;'>".$website['name']."'s Postback:</td>
            <td><input type='text' style='width:400px;' name='website[]' value='".$website['postback']."' /></td>
            </tr>";
        }

        print"
        <td style='float:right;position:relative;left:150px;'><input type='submit' name='submit' style='font-size:15px;height:30px;width:100px;' value='Submit' /></td>
        </form>
        </table>
        </div>";

        include("footer.php");
        ?>

    What I am attempting to do is insert what is entered in the text boxes into their corresponding website rows, and I cannot think of any other way to do it. This obviously does not work and returns a notice stating "Array to string conversion". If there is a more logical way to do this, please let me know.
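
    A hedged sketch of one way to fix this: send webid as an array too, then walk the two arrays together so each URL updates its own row (keeping the question's mysql_* style but adding escaping, and updating by id since that is what the hidden field carries):

        // In the loop, emit webid[] so it lines up index-for-index with website[]
        print "<input type='hidden' name='webid[]' value='".$website['id']."' />";

        // On submit, pair each submitted postback with the id sent alongside it
        if($_POST['submit']){
            foreach($_POST['website'] as $i => $postback){
                $webid    = mysql_real_escape_string($_POST['webid'][$i]);
                $postback = mysql_real_escape_string($postback);
                mysql_query("UPDATE `websites` SET `postback` = '$postback' WHERE id = '$webid'");
            }
        }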

    Read the article

  • Tomcat log include servlet context

    - by Kris
    I have a Tomcat instance running several websites. Recently I've been trying to deal with the various error messages that wind up in the Tomcat log file (catalina.out). None of the issues are affecting the websites, but all the noise is making it difficult to see actual problems. My problem is that frequently the message is being emitted by a library that is used by multiple webapps. Unless a stacktrace is included (which it often isn't) I can't tell which webapp is responsible without a lot of digging. So the question is, can I somehow configure Tomcat to include the servlet context in the log file? Or perhaps have different log files per context?
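
    Tomcat's JULI logging (standard since 5.5) can be configured per web application: a logging.properties file dropped into a webapp's WEB-INF/classes sends that context's java.util.logging output to its own file. A minimal sketch, with the prefix name chosen just for illustration:

        handlers = org.apache.juli.FileHandler

        org.apache.juli.FileHandler.level = INFO
        org.apache.juli.FileHandler.directory = ${catalina.base}/logs
        org.apache.juli.FileHandler.prefix = myapp.

    Messages from that webapp then land in logs/myapp.YYYY-MM-DD.log instead of catalina.out, which both separates the files and tells you which context emitted the noise. Libraries that log through other frameworks (log4j, commons-logging with a different backend) need the equivalent per-webapp config for that framework.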

    Read the article

  • Set up simple reverse proxy using IIS

    - by Ropstah
    I would like to reverse proxy my Jira installation on a Windows Server 2008 machine. Jira is running under http://jira.domain.com:8080/ and is accessible as such. The machine also runs IIS, hosting several ASP.NET websites. I followed the instructions here: http://blogs.iis.net/carlosag/archive/2010/04/01/setting-up-a-reverse-proxy-using-iis-url-rewrite-and-arr.aspx and installed URL Rewrite and ARR. I now have a "Web farm" node in my IIS instance but I've got no idea how to proceed. I tried adding some rules, but this made the rest of my IIS websites stop responding. Is there a simple way to say: (1) forward http://jira.domain.com to http://localhost:8080, and (2) ignore other domains and route them as usual? Any help is greatly appreciated!
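
    With ARR's proxy enabled (Application Request Routing Cache > Server Proxy Settings > "Enable proxy"), a single URL Rewrite rule scoped to the Jira host header should forward only that traffic and leave the other sites alone -- no web farm needed. A hedged sketch for the site bound to jira.domain.com:

        <rewrite>
          <rules>
            <rule name="Proxy to Jira" stopProcessing="true">
              <match url="(.*)" />
              <conditions>
                <add input="{HTTP_HOST}" pattern="^jira\.domain\.com$" />
              </conditions>
              <action type="Rewrite" url="http://localhost:8080/{R:1}" />
            </rule>
          </rules>
        </rewrite>

    Because of the host-header condition, requests for the other ASP.NET sites never match the rule and are routed as usual.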

    Read the article

  • Connection to website interrupted/reset. Why?

    - by Goje87
    When navigating to some websites (like victorsosea.com, citibank.co.in, etc.) in Google Chrome, I get a message that the 'connection to www.site.com was interrupted'. If I try in Firefox, it says 'The connection to the server was reset'. This happens no matter what internet connection I use on my laptop, be it my home internet or office internet. I have tried using Google's DNS addresses as well, but that does not seem to solve the problem. I have been facing this problem randomly for some websites. Kindly let me know how I can resolve the issue. My laptop configuration: Windows 7, 64-bit.
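
    Resets that follow the laptop across different networks point at something local (security software, a stale Winsock/TCP stack, or proxy settings) rather than any one ISP. Two quick checks from an elevated command prompt, using one affected site as the test target:

        rem Does the name resolve, and to a plausible address?
        nslookup www.citibank.co.in

        rem Reset the Winsock catalog -- a common fix when every network shows resets
        netsh winsock reset

    A reboot is needed after the Winsock reset; if the problem persists, temporarily disabling any antivirus/firewall web filtering is the next thing to rule out.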

    Read the article

  • IIS cannot access itself

    - by dave
    We are on a corporate network that uses ISA, and I am having issues trying to keep local requests from going through ISA. I have IIS7 on my local Windows 7 machine, hosting websites and a service layer. The websites access the service layer using an xxxx.servicelayer.local address that is set up in my HOSTS file to point to 127.0.0.1. I have the Firewall Client, which I have disabled. I have tried both adding this address to IE's proxy exceptions so that it does not go through ISA, and disabling the proxy settings altogether. When the website (which is actually IIS making the request to itself) tries to access the service layer, I receive an ISA error that proxy authentication has failed. Considering that everything I can see to configure is set to not go through the proxy (ISA), I cannot see how this is actually going through the proxy and giving this error. Is there something within Windows 7 that forces the proxy setting, some sort of caching or similar?
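
    One likely culprit: code running inside IIS does not use IE's per-user proxy settings; it follows the machine-wide WinHTTP proxy (and the machine account cannot authenticate to ISA, hence the proxy-authentication error). A hedged check and fix from an elevated prompt -- the proxy name is a placeholder, and the bypass entry mirrors the HOSTS alias from the question:

        rem Show the machine-wide proxy that system services actually use
        netsh winhttp show proxy

        rem Either clear it entirely ...
        netsh winhttp reset proxy

        rem ... or keep the ISA proxy but bypass the local service-layer names
        netsh winhttp set proxy proxy-server="isaserver:8080" bypass-list="*.servicelayer.local;<local>"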

    Read the article

  • Apache mpm-itk Performance

    - by Matt Beckman
    I manage a bunch of VPSs with memory ranging from 1GB to 8GB. Most of these websites are Joomla websites, and the servers must support multiple sites/users/S-FTP. I use mpm-itk almost exclusively (mostly due to its convenience in these shared environments). However, I'm aware it isn't known for performance, so I need some advice on making it faster. Due to the lack of documentation when I first went the way of mpm-itk, I included only one setting in the config, and that was to limit each user to 50 clients (the rest I left up to defaults):

        <IfModule mpm_itk_module>
            MaxClientsVHost 50
        </IfModule>

    Are there any better alternatives available? Are there any settings supported in mpm-prefork or mpm-worker that are also supported in mpm-itk? Thanks!
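
    To the last question: mpm-itk is built on top of prefork, so the usual prefork tuning directives apply to it as well, and that is where most of the headroom is. A hedged starting point for a mid-sized VPS (the numbers are illustrative, not benchmarks):

        <IfModule mpm_itk_module>
            StartServers          5
            MinSpareServers       5
            MaxSpareServers      10
            MaxClients          150
            MaxRequestsPerChild 500
            MaxClientsVHost      50
        </IfModule>

    Keeping KeepAliveTimeout low (2-5 seconds) matters more than usual here, because each itk process is pinned to one client for the whole keep-alive window.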

    Read the article

  • PHP matching a string

    - by John Jones
    Hi, I have an Indian company data set and need to extract the city and ZIP from the address field. Address field example: Gowripuram West, Sengunthapuram Post, Near L.G.B., Karur, Tamilnadu, Karur - 639 002, India. As you can see, the city is Karur and the ZIP comes after the - (hyphen). I need PHP code to match [city] - [zip]. I'm not sure how to do this: I can find the ZIP after the hyphen, but not how to find the city. Please note the city can be two words. Cheers for your time. /J
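
    Since the city and PIN sit together in one comma-separated segment ("Karur - 639 002"), one approach is a single regex: letters and spaces (which allows a two-word city, and stops at the preceding comma), a hyphen, then the six-digit code. A sketch:

        <?php
        $address = 'Gowripuram West, Sengunthapuram Post, Near L.G.B., Karur, '
                 . 'Tamilnadu, Karur - 639 002, India';

        // City: letters/spaces only, so the match cannot cross a comma.
        // Zip:  Indian PIN codes are six digits, often written "639 002".
        if (preg_match('/([A-Za-z][A-Za-z ]*)\s*-\s*(\d{3}\s?\d{3})/', $address, $m)) {
            $city = trim($m[1]);                  // "Karur"
            $zip  = str_replace(' ', '', $m[2]);  // "639002"
        }
        ?>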

    Read the article

  • Software to measure the speed of HTTP browser connections

    - by Dan Revell
    I'm trying to gather some ammunition about my ISP and its aggressive traffic shaping of particular websites, so I'm looking for an application that will let me see what speeds I get to particular websites. I want to open a browser connection to a particular video streaming website, for example, and see the speed of the connection that gets made, be it HTTP or plain TCP. I'm after something along the lines of TCPView, but unfortunately that doesn't include transfer speeds. I'm half tempted to write one myself, but hopefully something is already out there that does the job.
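
    If a full monitoring application turns out to be overkill, curl can already report per-transfer throughput, assuming it is installed (the URL is a placeholder for a file on the site being tested):

        curl -s -o NUL -w "speed: %{speed_download} bytes/s, total: %{time_total}s\n" http://example.com/video.mp4

    Running the same fetch at different times of day, or through a VPN, and comparing the numbers is usually enough to demonstrate shaping.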

    Read the article

  • Subversion for web designers: repository on a network share and FTP to the live server?

    - by ceatus
    My configuration:

    - htdocs on a Windows network share (Z:)
    - web developers check out with Dreamweaver, modify, and check back in to the Z: drive
    - LAMP running on an Ubuntu server virtualized on Hyper-V, with Apache pointed at the Z: drive, in order to test the dev websites
    - upload by FTP to the live server

    Now: I need multiple access to the repository, need to keep it on a network share, and we manage about 200 websites. All the web developers, administrators and IT need access to the share. I found out that creating an SVN server is the best way for me, so I created one on an Ubuntu Server virtualized on Hyper-V. Right now I have the repos local on the Ubuntu Server, but I'd like them on my network drive, and I'd like to have a post-commit hook, if possible, in order to FTP directly to my live server. Do you guys think a WebDAV solution would be better? Thanks in advance. Angelo
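
    For the FTP step: a Subversion post-commit hook is just a script the repository server runs after each commit, so something along these lines could export the committed tree and mirror it to the live host. A sketch only -- the trunk layout, credentials, hostname, and the lftp dependency are all assumptions:

        #!/bin/sh
        # hooks/post-commit -- invoked by Subversion as: post-commit REPOS REV
        REPOS="$1"
        REV="$2"

        # Export a clean copy of the site at this revision (no .svn folders)
        svn export --force -r "$REV" "file://$REPOS/trunk" /tmp/site-export

        # Mirror the export to the live server over FTP
        lftp -u deploy,secret -e "mirror -R /tmp/site-export /htdocs; quit" ftp.example.com

    Note that serving the repositories over svn:// or http:// from the Ubuntu box is generally safer than exposing the repository files themselves on the Windows share, where a stray client could corrupt them.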

    Read the article
