Search Results

Search found 18450 results on 738 pages for 'website attacks'.

Page 15 of 738

  • Is it possible to host a website in the 'ether' of the Internet -- not on a server -- so that it cannot be taken down?

    - by Chris Altman
    This is a theoretical problem I am curious about. Websites are hosted on servers. Servers can be taken offline. Is it possible to host a website in the 'ether' of the Internet -- not on a server -- so that it cannot be taken down? One example is that the website is hosted on other websites, like a parasite. Another is that it is assembled through storing pieces on DNS machines, routers, etc., so that it gets assembled on the fly. The purpose is that this website could live forever because no one person can remove it. The answers I am looking for are plausible ideas/approaches for how this could technically be built.
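
    One concrete flavour of the 'assembled on the fly' idea is content addressing, as used in DHT-style systems: every piece is stored and fetched under the hash of its own bytes, so any node holding a piece can serve it and the page can be rebuilt from whichever copies survive. A minimal Python sketch of just the chunk/reassemble step (the distributed storage layer is assumed, not shown):

        import hashlib

        def split_into_chunks(page: bytes, chunk_size: int = 1024):
            """Break a page into pieces and key each piece by the hash of its bytes."""
            manifest, store = [], {}
            for i in range(0, len(page), chunk_size):
                piece = page[i:i + chunk_size]
                digest = hashlib.sha256(piece).hexdigest()
                manifest.append(digest)
                store[digest] = piece
            return manifest, store

        def reassemble(manifest, store) -> bytes:
            """Rebuild the page from an ordered list of chunk hashes, verifying each piece."""
            out = bytearray()
            for digest in manifest:
                piece = store[digest]
                assert hashlib.sha256(piece).hexdigest() == digest  # tamper check
                out.extend(piece)
            return bytes(out)

        page = b"<html>example page content</html>"
        manifest, store = split_into_chunks(page, chunk_size=8)
        assert reassemble(manifest, store) == page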

    Read the article

  • Should I have separate business and personal websites?

    - by Thomas Clowes
    I have my business website - I am a web designer and developer, and I also buy/sell websites/domain names. As such my website links to 'Our sites' - the websites which we design and run - as well as a variety of tools such as a domain whois tool. These are obviously relevant to the business. As an individual, I like to travel and do white water kayaking as a hobby. I also have a degree in economics. I have thus created a blog on my business website where I write about domain names, web design, kayaking, travelling and economics. I've just begun researching SEO and am looking into optimizing my business website. I don't actually directly offer any services to clients at the moment; my main aim is to have a business website which supports my websites. If, for example, a potential advertiser on one of my sites checks out the business website, I want them to think professional, down to earth, quirky. Given this, is having my business/personal interests intertwined a problem for SEO? On my homepage, for example, when I'm writing a headline and a paragraph about what we do, what do I put, and how do I optimize it with keywords and the like? Further to the above, my company sponsors me and a group of acquaintances as a kayaking team, so my personal interests do sort of overlap (just to add a complexity :))

    Read the article

  • Collision Attacks, Message Digests and a Possible solution

    - by Dominar
    I've been doing some preliminary research in the area of message digests, specifically collision attacks on cryptographic hash functions such as MD5 and SHA-1, such as the PostScript example and the X.509 certificate duplicate. From what I can tell, in the case of the PostScript attack, specific data was generated and embedded within the header of the PostScript file (which is ignored during rendering) which brought the internal state of the MD5 computation to a point where the modified wording of the document would lead to a final MD equivalent to the original. The X.509 attack took a similar approach, whereby data was injected within the comment/whitespace of the certificate. OK, so here is my question, and I can't seem to find anyone asking it: why isn't the length of ONLY the data being consumed added as a final block to the MD calculation? In the case of X.509, why are the whitespace and comments being taken into account as part of the MD? Wouldn't a simple process such as one of the following be enough to resolve the proposed collision attacks: MD(M + |M|) = xyz, or MD(M + |M| + |M| * magicseed_0 + ... + |M| * magicseed_n) = xyz, where M is the message, |M| is the size of the message, MD is the message digest function (e.g. MD5, SHA, Whirlpool, etc.), xyz is the actual message digest value for the message M, and magicseed_{i} is a set of random values generated with a seed based on the internal state prior to the size being added. This technique should work, as to date all such collision attacks rely on adding more data to the original message. In short, the level of difficulty involved in generating a collision message that not only generates the same MD, but is also comprehensible/parsable/compliant and is also the same size as the original message, is immensely difficult if not near impossible. Has this approach ever been discussed? Any links to papers etc. would be nice.
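
    A minimal sketch of the first construction the poster describes, MD(M + |M|), written here with Python's hashlib and SHA-256 purely for illustration (the function name and the length encoding are assumptions, not part of the question):

        import hashlib
        import struct

        def md_with_length(message: bytes) -> str:
            # Append |M| (the length of the original message, 8 bytes, big-endian)
            # as a final block before hashing, as proposed in the question.
            suffix = struct.pack(">Q", len(message))
            return hashlib.sha256(message + suffix).hexdigest()

        # A forged message that appends extra data no longer keeps the same
        # trailing length block, so its digest input differs in two places.
        print(md_with_length(b"original document text"))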

    Read the article

  • How to Own Your Own Website (Even If You Can’t Build One) Pt 2

    - by Eric Z Goodnight
    Last week we talked about how to buy and start a simple website using WordPress. Today, we’ll start customizing our WordPress site and get you off on the right foot to having a great quality, feature rich website. We’ll take a quick walk through the menus of WordPress and help to make it easier on a first time user, as well as showing you how to start your new site off with a theme and an easily updatable, customized navigation. It can be intimidating to start a new WordPress site, but stick with us—part two of “How to Own Your Own Website” is coming right up.

    Read the article

  • Where To Begin To Make A Website

    - by lolyoshi
    I'm a newbie in web programming. I haven't done anything related to websites before. Now, my new task is creating a website using Java, JSP, HTML, CSS, MySQL, Apache and the Spring Framework (MVC model). I want to know what I should research if I want my website to have functions such as posting entries, commenting on entries, deleting entries, editing entries, etc., like a forum. What else do I need to know besides the above? I also don't know how to update my website automatically when there are changes, such as the top viewed products or the best products. I don't think I'll input or change them manually. So, which tools or languages can support that? Thanks in advance.

    Read the article

  • My website gives dns_server_failure when using my University connection

    - by iMohammad
    My website used to open just fine in the past when I used my University connection, but since I transferred my website to another hosting company this problem started to appear. Sometimes the website opens and sometimes I get this a lot: Network Error (dns_server_failure) Your request could not be processed because an error occurred contacting the DNS server. The DNS server may be temporarily unavailable, or there could be a network problem. For assistance, contact your network support team. Do you have any idea? I checked using website checking tools and my website was running just fine on any other connection (ADSL, 3G) except my University connection. Thanks in advance :) UPDATE: When I open my website using the server's real IP address it opens just fine, but not with my domain. Also, other websites that are hosted on my server cannot be opened either.
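
    A quick way to check whether the University's resolver is at fault is to ask different DNS servers the same question and compare the answers. A minimal sketch using the third-party dnspython package (version 2.x assumed; the domain and the campus resolver IP are placeholders):

        import dns.resolver  # third-party package: dnspython >= 2.0

        def lookup(domain: str, nameserver: str):
            """Ask one specific DNS server for the A records of a domain."""
            resolver = dns.resolver.Resolver(configure=False)
            resolver.nameservers = [nameserver]
            try:
                return [rr.address for rr in resolver.resolve(domain, "A")]
            except Exception as exc:  # NXDOMAIN, timeout, SERVFAIL, ...
                return f"lookup failed: {exc}"

        domain = "example.com"                              # placeholder for the affected domain
        print("Public DNS :", lookup(domain, "8.8.8.8"))
        print("Campus DNS :", lookup(domain, "10.0.0.1"))   # placeholder campus resolver IP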

    Read the article

  • Visitors have old website cached in their browsers

    - by RussianBlue
    My client's new website is example.com; the old website is example.co.uk. I've re-pointed the A records to the new website (so as to leave the emails alone) and put in 301 redirects from old pages to new pages. But my client is upset as he (and, he thinks, many of his clients) have the old website cached in their browsers and won't know how to clear their browser cache. Is there anything I can do to overcome this, and if not, how long will it be before browsers finally stop using their cached pages, so I can at least go back to my client and tell him that his clients will eventually start to see the new website?
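
    As a quick sanity check on the 301s, and on what the new pages tell browsers to cache, something like the following could be run against a handful of old URLs (the URL shown is a placeholder; requests is a third-party package):

        import requests

        response = requests.get("http://example.co.uk/some-old-page", timeout=10)
        for hop in response.history:  # each redirect that was followed on the way
            print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
        print("final:", response.status_code, response.url)
        print("Cache-Control on the new page:", response.headers.get("Cache-Control"))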

    Read the article

  • ASP.NET Website Security Tips and Tricks

    This is a tutorial on how to secure your ASP.NET Mono website. Securing an ASP.NET website that runs in Mono is very different from securing an ASP.NET website hosted in a Windows environment, because an ASP.NET Mono website runs on a non-Windows server such as Apache or Lighttpd and on an operating system such as Linux or Unix. Thus the principles of securing a website on an Apache server can be applied to securing an ASP.NET website that runs in Mono....

    Read the article

  • E-Commerce Website

    - by haargott
    I am planning to create an e-commerce website for users to buy products and services. In this website I want users to register and also participate in something like a browser game, where every user may receive some questions which they have to answer. For each question they successfully answer, they receive points, and the number of collected points decides which rank they are on. Edit 2: Currently I am considering using only HTML, CSS, JavaScript, PHP and SQL to design this e-commerce website. Along with this I was thinking about learning jQuery as it may help me, but I am not sure if I should code everything myself or just use the library to make it faster. 1) Could you tell me if those languages are sufficient for creating such a website as described? 2) Could you tell me what kind of free software tools and frameworks are most appropriate to use when creating this e-commerce website?
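
    The points-to-rank part of the idea is simple to prototype in any of those languages; a minimal sketch in Python (the thresholds and rank names are invented for illustration, the post does not specify any):

        # Hypothetical thresholds -- the post does not give any numbers.
        RANK_THRESHOLDS = [
            (0, "Bronze"),
            (100, "Silver"),
            (500, "Gold"),
            (2000, "Platinum"),
        ]

        def rank_for(points: int) -> str:
            """Return the highest rank whose threshold the user has reached."""
            rank = RANK_THRESHOLDS[0][1]
            for threshold, name in RANK_THRESHOLDS:
                if points >= threshold:
                    rank = name
            return rank

        print(rank_for(250))  # -> "Silver"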

    Read the article

  • How to transfer a website hosted online to my Virtual Host

    - by user831740
    I'm doing some work on a webpage, which means I have to modify a few things. As a result I installed Apache and all the things that come with it, and also installed Joomla, and I got everything running well. My problem is that I need to make Apache run this website locally, so I downloaded the public_html folder from the FTP server of my website, but I have absolutely no idea how to set the website up in order to make it run on Apache. I've read a few guides, but they all tell me how to create a new website instead of helping me host a website that is already built.

    Read the article

  • Website Ethics / legal issues, image copyrights

    - by RailsN00b
    Ignoring the technical implementation of a website for a second, assume a website that is similar to Twitter but with pictures. A user says something and posts a picture of whatever they said. Given the nature of the internet, the images will most likely not be his/her own. There are 2 options that I see for dealing with this: 1. The user posts a URL of the picture and the website pulls the picture from that URL every time someone visits that page. 2. The website saves the image in its own database of images and displays the image to visitors 'locally'. The problem with option #1: while it saves storage, I see an issue with 'stealing' other websites' bandwidth, and if my website has many, many visitors it could cost the image-hosting websites a lot and possibly even crash them if their servers can't handle the load. The problem with option #2: while it saves the load on other websites, it practically takes pictures that could have copyright on them. Which option is better to implement in terms of legal issues and ethics? When do I need to contact another website to request permission to use the images from that site? Does anyone really care about that anymore? Where can I read about this?
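
    For reference, option #2 above boils down to fetching the image once and serving a local copy; a minimal Python sketch of that step (requests is a third-party package; content-type checks, size limits and, crucially, the copyright question itself are left out):

        import hashlib
        from pathlib import Path

        import requests

        def store_locally(image_url: str, media_dir: str = "media") -> Path:
            """Fetch the image once and keep our own copy, instead of
            hot-linking the original host on every page view (option #1)."""
            response = requests.get(image_url, timeout=10)
            response.raise_for_status()
            name = hashlib.sha256(response.content).hexdigest() + ".img"
            path = Path(media_dir) / name
            path.parent.mkdir(parents=True, exist_ok=True)
            path.write_bytes(response.content)
            return path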

    Read the article

  • Simple website with a GPL V3 Framework

    - by sineverba
    I write web-based software and simple websites ("Home", "Who we are", "Contact"). For a simple website I'm using a framework covered by the GPL v3. The user surfs the website, sends an email, gets info, etc. I repeat: a simple website, not a Joomla or WordPress site. 1) Will the website be covered by the GPL? I don't modify the framework; I'm using its classes in other classes (OOP). 2) If the answer to point 1 is yes, do I need to add (e.g. in the footer) the name of the framework and a link to it? 3) Must I permit download of the entire website so the code can be studied (nothing a programmer would be interested in), e.g. by placing it on GitHub? 4) If the answer to 2 is no, how could anyone even tell that we use that framework? In effect no PHP lines are exposed to the browser... You cannot tell that when you push "Send email" the site is calling $this->send($email). If you wrote me an email asking "Are you using the XXX framework?", I could answer no.

    Read the article

  • 10 Essential Tools for building ASP.NET Websites

    - by Stephen Walther
    I recently put together a simple public website created with ASP.NET for my company at Superexpert.com. I was surprised by the number of free tools that I ended up using to put together the website. Therefore, I thought it would be interesting to create a list of essential tools for building ASP.NET websites. These tools work equally well with both ASP.NET Web Forms and ASP.NET MVC. Performance Tools After reading Steve Souders two (very excellent) books on front-end website performance High Performance Web Sites and Even Faster Web Sites, I have been super sensitive to front-end website performance. According to Souders’ Performance Golden Rule: “Optimize front-end performance first, that's where 80% or more of the end-user response time is spent” You can use the tools below to reduce the size of the images, JavaScript files, and CSS files used by an ASP.NET application. 1. Sprite and Image Optimization Framework CSS sprites were first described in an article written for A List Apart entitled CSS sprites: Image Slicing’s Kiss of Death. When you use sprites, you combine multiple images used by a website into a single image. Next, you use CSS trickery to display particular sub-images from the combined image in a webpage. The primary advantage of sprites is that they reduce the number of requests required to display a webpage. Requesting a single large image is faster than requesting multiple small images. In general, the more resources – images, JavaScript files, CSS files – that must be moved across the wire, the slower your website. However, most people avoid using sprites because they require a lot of work. You need to combine all of the images and write just the right CSS rules to display the sub-images. The Microsoft Sprite and Image Optimization Framework enables you to avoid all of this work. The framework combines the images for you automatically. Furthermore, the framework includes an ASP.NET Web Forms control and an ASP.NET MVC helper that makes it easy to display the sub-images. You can download the Sprite and Image Optimization Framework from CodePlex at http://aspnet.codeplex.com/releases/view/50869. The Sprite and Image Optimization Framework was written by Morgan McClean who worked in the office next to mine at Microsoft. Morgan was a scary smart Intern from Canada and we discussed the Framework while he was building it (I was really excited to learn that he was working on it). Morgan added some great advanced features to this framework. For example, the Sprite and Image Optimization Framework supports something called image inlining. When you use image inlining, the actual image is stored in the CSS file. Here’s an example of what image inlining looks like: .Home_StephenWalther_small-jpg { width:75px; height:100px; background: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAEsAAABkCAIAAABB1lpeAAAAB GdBTUEAALGOfPtRkwAAACBjSFJNAACHDwAAjA8AAP1SAACBQAAAfXkAAOmLAAA85QAAGcxzPIV3AAAKL s+zNfREAAAAASUVORK5CYII=) no-repeat 0% 0%; } The actual image (in this case a picture of me that is displayed on the home page of the Superexpert.com website) is stored in the CSS file. If you visit the Superexpert.com website then very few separate images are downloaded. For example, all of the images with a red border in the screenshot below take advantage of CSS sprites: Unfortunately, there are some significant Gotchas that you need to be aware of when using the Sprite and Image Optimization Framework. There are workarounds for these Gotchas. 
I plan to write about these Gotchas and workarounds in a future blog entry. 2. Microsoft Ajax Minifier Whenever possible you should combine, minify, compress, and cache with a far future header all of your JavaScript and CSS files. The Microsoft Ajax Minifier makes it easy to minify JavaScript and CSS files. Don’t confuse minification and compression. You need to do both. According to Souders, you can reduce the size of a JavaScript file by an additional 20% (on average) by minifying a JavaScript file after you compress the file. When you minify a JavaScript or CSS file, you use various tricks to reduce the size of the file before you compress the file. For example, you can minify a JavaScript file by replacing long JavaScript variable names with short variable names and removing unnecessary white space and comments. You can minify a CSS file by doing such things as replacing long color names such as #ffffff with shorter equivalents such as #fff. The Microsoft Ajax Minifier was created by Microsoft employee Ron Logan. Internally, this tool was being used by several large Microsoft websites. We also used the tool heavily on the ASP.NET team. I convinced Ron to publish the tool on CodePlex so that everyone in the world could take advantage of it. You can download the tool from the ASP.NET Ajax website and read documentation for the tool here. I created the installer for the Microsoft Ajax Minifier. When creating the installer, I also created a Visual Studio build task to make it easy to minify all of your JavaScript and CSS files whenever you do a build within Visual Studio automatically. Read the Ajax Minifier Quick Start to learn how to configure the build task. 3. ySlow The ySlow tool is a free add-on for Firefox created by Yahoo that enables you to test the front-end of your website. For example, here are the current test results for the Superexpert.com website: The Superexpert.com website has an overall score of B (not perfect but not bad). The ySlow tool is not perfect. For example, the Superexpert.com website received a failing grade of F for not using a Content Delivery Network even though the website is using the Microsoft Ajax Content Delivery Network for JavaScript files such as jQuery. Uptime After publishing a website live to the world, you want to ensure that the website does not encounter any issues and that it stays live. I use the following tools to monitor the Superexpert.com website now that it is live. 4. ELMAH ELMAH stands for Error Logging Modules and Handlers for ASP.NET. ELMAH enables you to record any errors that happen at your website so you can review them in the future. You can download ELMAH for free from the ELMAH project website. ELMAH works great with both ASP.NET Web Forms and ASP.NET MVC. You can configure ELMAH to store errors in a number of different stores including XML files, the Event Log, an Access database, a SQL database, an Oracle database, or in computer RAM. You also can configure ELMAH to email error messages to you when they happen. By default, you can access ELMAH by requesting the elmah.axd page from a website with ELMAH installed. Here’s what the elmah.axd page looks like from the Superexpert.com website (this page is password-protected because secret information can be revealed in an error message): If you click on a particular error message, you can view the original Yellow Screen ASP.NET error message (even when the error message was never displayed to the actual user).
I installed ELMAH by taking advantage of the new package manager for ASP.NET named NuGet (originally named NuPack). You can read the details about NuGet in the following blog entry by Scott Guthrie. You can download NuGet from CodePlex. 5. Pingdom I use Pingdom to verify that the Superexpert.com website is always up. You can sign up for Pingdom by visiting Pingdom.com. You can use Pingdom to monitor a single website for free. At the Pingdom website, you configure the frequency that your website gets pinged. I verify that the Superexpert.com website is up every 5 minutes. I have the Pingdom service verify that it can retrieve the string “Contact Us” from the website homepage. If your website goes down, you can configure Pingdom so that it sends an email, Twitter, SMS, or iPhone alert. I use the Pingdom iPhone app which looks like this: 6. Host Tracker If your website does go down then you need some way of determining whether it is a problem with your local network or if your website is down for everyone. I use a website named Host-Tracker.com to check how badly a website is down. Here’s what the Host-Tracker website displays for the Superexpert.com website when the website can be successfully pinged from everywhere in the world: Notice that Host-Tracker pinged the Superexpert.com website from 68 locations including Roubaix, France and Scranton, PA. Debugging I mean debugging in the broadest possible sense. I use the following tools when building a website to verify that I have not made a mistake. 7. HTML Spell Checker Why doesn’t Visual Studio have a built-in spell checker? Don’t know – I’ve always found this mysterious. Fortunately, however, a former member of the ASP.NET team wrote a free spell checker that you can use with your ASP.NET pages. I find a spell checker indispensable. It is easy to delude yourself that you are capable of perfect spelling. I’m always super embarrassed when I actually run the spell checking tool and discover all of my spelling mistakes. The fastest way to add the HTML Spell Checker extension to Visual Studio is to select the menu option Tools, Extension Manager within Visual Studio. Click on Online Gallery and search for HTML Spell Checker: 8. IIS SEO Toolkit If people cannot find your website through Google then you should not even bother to create it. Microsoft has a great extension for IIS named the IIS Search Engine Optimization Toolkit that you can use to identify issues with your website that would hurt its page rank. You also can use this tool to quickly create a sitemap for your website that you can submit to Google or Bing. You can even generate the sitemap for an ASP.NET MVC website. Here’s what the report overview for the Superexpert.com website looks like: Notice that the Superexpert.com website had plenty of violations. For example, there are 65 cases in which a page has a broken hyperlink. You can drill into these violations to identify the exact page and location where these violations occur. 9. LinqPad If your ASP.NET website accesses a database then you should be using LINQ to Entities with the Entity Framework. Using LINQ involves some magic. LINQ queries written in C# get converted into SQL queries for you. If you are not careful about how you write your LINQ queries, you could unintentionally build a really badly performing website. LinqPad is a free tool that enables you to experiment with your LINQ queries. It even works with Microsoft SQL CE 4 and Azure. You can use LinqPad to execute a LINQ to Entities query and see the results.
You also can use it to see the resulting SQL that gets executed against the database: 10. .NET Reflector I use .NET Reflector daily. The .NET Reflector tool enables you to take any assembly and disassemble the assembly into C# or VB.NET code. You can use .NET Reflector to see the “Source Code” of an assembly even when you do not have the actual source code. You can download a free version of .NET Reflector from the Redgate website. I use .NET Reflector primarily to help me understand what code is doing internally. For example, I used .NET Reflector with the Sprite and Image Optimization Framework to better understand how the MVC Image helper works. Here’s part of the disassembled code from the Image helper class: Summary In this blog entry, I’ve discussed several of the tools that I used to create the Superexpert.com website. These are tools that I use to improve the performance, improve the SEO, verify the uptime, or debug the Superexpert.com website. All of the tools discussed in this blog entry are free. Furthermore, all of these tools work with both ASP.NET Web Forms and ASP.NET MVC. Let me know if there are any tools that you use daily when building ASP.NET websites.
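
    As a side note, the image-inlining output shown earlier in the article (a base64 data: URI embedded straight into the CSS) is easy to reproduce by hand; a minimal Python sketch, with an invented file name and sizes, that generates such a rule:

        import base64
        from pathlib import Path

        def css_inline_image(selector: str, image_path: str, width: int, height: int) -> str:
            """Emit a CSS rule that embeds the image as a base64 data: URI."""
            data = base64.b64encode(Path(image_path).read_bytes()).decode("ascii")
            return (
                f"{selector} {{ width:{width}px; height:{height}px; "
                f"background: url(data:image/png;base64,{data}) no-repeat 0% 0%; }}"
            )

        # Invented file name -- any small PNG will do.
        print(css_inline_image(".Home_StephenWalther_small-jpg", "stephen_small.png", 75, 100))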

    Read the article

  • Making nginx withstand flood attacks

    - by Tiffany Walker
    How can I make it stand up against attacks better? Are there plugins? I'm looking for a way to RATE LIMIT and remain up and not slow down. My Setup: user nobody; # no need for more workers in the proxy mode worker_processes 4; worker_cpu_affinity 0001 0010 0100 1000; worker_priority -2; error_log /var/log/nginx/error.log info; worker_rlimit_nofile 40480; events { worker_connections 5120; # increase for busier servers use epoll; # you should use epoll here for Linux kernels 2.6.x } http { server_name_in_redirect off; server_names_hash_max_size 10240; server_names_hash_bucket_size 1024; include mime.types; default_type application/octet-stream; server_tokens off; disable_symlinks if_not_owner; sendfile on; tcp_nopush on; tcp_nodelay on; keepalive_timeout 5; gzip on; gzip_vary on; gzip_disable "MSIE [1-6]\."; gzip_proxied any; gzip_http_version 1.1; gzip_min_length 1000; gzip_comp_level 9; gzip_buffers 16 8k; # You can remove image/png image/x-icon image/gif image/jpeg if you have slow CPU gzip_types text/plain text/xml text/css application/x-javascript application/xml image/png image/x-icon image/gif image/jpeg application/xml+rss text/javascript application/atom+xml; ignore_invalid_headers on; client_header_timeout 3m; client_body_timeout 3m; send_timeout 3m; reset_timedout_connection on; connection_pool_size 256; client_header_buffer_size 256k; large_client_header_buffers 4 256k; client_max_body_size 200M; client_body_buffer_size 128k; request_pool_size 32k; output_buffers 4 32k; postpone_output 1460; proxy_temp_path /tmp/nginx_proxy/; client_body_in_file_only on; log_format bytes_log "$msec $bytes_sent ."; include "/etc/nginx/vhosts/*"; } vhost file: server { error_log /var/log/nginx/vhost-error_log warn; listen 194.145.208.19:80; server_name ipxnow.in www.ipxnow.in; access_log /usr/local/apache/domlogs/ipxnow.in-bytes_log bytes_log; access_log /usr/local/apache/domlogs/ipxnow.in combined; root /home/ipxnowin/public_html; location / { location ~.*\.(3gp|gif|jpg|jpeg|png|ico|wmv|avi|asf|asx|mpg|mpeg|mp4|pls|mp3|mid|wav|swf|flv|html|htm|txt|js|css|exe|zip|tar|rar|gz|tgz|bz2|uha|7z|doc|docx|xls|xlsx|pdf|iso)$ { expires 7d; try_files $uri @backend; } error_page 405 = @backend; add_header X-Cache "HIT from Backend"; proxy_pass http://194.145.208.19:8081; include proxy.inc; } location @backend { internal; proxy_pass http://194.145.208.19:8081; include proxy.inc; } location ~ .*\.(php|jsp|cgi|pl|py)?$ { proxy_pass http://194.145.208.19:8081; include proxy.inc; } location ~ /\.ht { deny all; } } and proxy.inc: proxy_connect_timeout 59s; proxy_send_timeout 600; proxy_read_timeout 600; proxy_buffer_size 64k; proxy_buffers 16 32k; proxy_busy_buffers_size 64k; proxy_temp_file_write_size 64k; proxy_pass_header Set-Cookie; proxy_redirect off; proxy_hide_header Vary; proxy_set_header Accept-Encoding ''; proxy_ignore_headers Cache-Control Expires; proxy_set_header Referer $http_referer; proxy_set_header Host $host; proxy_set_header Cookie $http_cookie; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-Host $host; proxy_set_header X-Forwarded-Server $host; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
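
    For the rate limiting part, nginx's stock limit_req / limit_conn modules are the usual starting point; a minimal sketch of how they might be wired into a setup like the one above (zone names, rates and burst values are illustrative, not tuned for this server):

        # In the http {} block: track clients by IP, allow roughly 10 requests/second each.
        limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;
        limit_conn_zone $binary_remote_addr zone=peripconn:10m;

        # In the vhost, e.g. inside location / {}:
        limit_req zone=perip burst=20 nodelay;   # queue small bursts, answer the rest with 503
        limit_conn peripconn 20;                 # cap simultaneous connections per IP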

    Read the article

  • Why Does My Website Redirect me to my localhost?

    - by Noah Brainey
    Alright, my website has some issues and I'm not sure what's causing them. Visit this page http://online-file-sharing.net/tos.html and click one of the bottom footer links... it redirects you to your localhost in the address bar. I have no idea why it does this. I'm hosting this website on my own server, which is this computer, using XAMPP, if that information helps. Anyway, any help would be greatly appreciated! I'm also using DynDNS as my nameservers.

    Read the article

  • Where is the best location to keep shared-developer website files in the linux hierarchy?

    - by Tchalvak
    I just started hosting files for a website on my server, and I'm not sure where an appropriate place to keep them is. At the moment, I have them in /var/www/name.of.virtualhost.site/www/. That's obviously not secure, because anything alongside the final public /www/ folder is also accessible, since the whole /var/www/ tree is already being served up. For example, /var/www/name.of.virtualhost.site/docs/site_policies.txt is accessible via something like defaultsite.com/name.of.virtualhost.site/docs/site_policies.txt. So where is a good place to store the files that make up a website? (When it's a site that only I'm developing, I can obviously just stick them in /home/my_username/sites/name.of.virtualhost.site/, but that doesn't work well when I want other developers to be working on the site's files as well.) I'm running a LAMP stack, not that I expect it to matter.

    Read the article

  • How can I get a virus by just visiting a website?

    - by Janet Jacobs
    It is common knowledge that you can get a virus just by visiting a website. But how is this possible? Do these viruses attack Windows, Mac and Linux users, or are Mac/Linux users immune? I understand that I obviously can get a virus by downloading and executing a .exe in Windows but how can I get a virus just by accessing a website? Are the viruses programmed in JavaScript? (It would make sense since it is a programming language that runs locally.) If so, what JavaScript functions are the ones commonly used?

    Read the article

  • How to read a Website's Directory Structure using WMI and C# in IIS 6.0?

    - by Steve Johnson
    Hi all, I need to read a website's folders using WMI and C# in IIS 6.0. I am able to read the virtual directories and applications using the "IISWebVirtualDirSetting" class. However, the physical folders located inside a website cannot be read using this class, and in my case I need to read sub-folders located within a website and later on set permissions on them. For my requirement I don't need to work on Virtual Directories/Web Service Applications (which can be easily obtained using the code below). I have tried to use the IISWebDirectory class but it has not been useful. Here is the code that reads IIS Virtual Directories... public static ArrayList RetrieveVirtualDirList(String ServerName, String WebsiteName) { ConnectionOptions options = SetUpAuthorization(); ManagementScope scope = new ManagementScope(string.Format(@"\\{0}\root\MicrosoftIISV2", ServerName), options); scope.Connect(); String SiteId = GetSiteIDFromSiteName(ServerName, WebsiteName); ObjectQuery OQuery = new ObjectQuery(@"SELECT * FROM IISWebVirtualDirSetting"); //ObjectQuery OQuery = new ObjectQuery(@"SELECT * FROM IIsSetting"); ManagementObjectSearcher WebSiteFinder = new ManagementObjectSearcher(scope, OQuery); ArrayList WebSiteListArray = new ArrayList(); ManagementObjectCollection WebSitesCollection = WebSiteFinder.Get(); String WebSiteName = String.Empty; foreach (ManagementObject WebSite in WebSitesCollection) { WebSiteName = WebSite.Properties["Name"].Value.ToString(); WebsiteName = WebSiteName.Replace("W3SVC/", ""); String extrctedSiteId = WebsiteName.Substring(0, WebsiteName.IndexOf('/')); String temp = WebsiteName.Substring(0, WebsiteName.IndexOf('/') + 1); String VirtualDirName = WebsiteName.Substring(temp.Length); WebsiteName = WebsiteName.Replace(SiteId, ""); if (extrctedSiteId.Equals(SiteId)) //if (true) { WebSiteListArray.Add(VirtualDirName ); //WebSiteListArray.Add(WebSiteName); //+ "|" + WebSite.Properties["Path"].Value.ToString() } } return WebSiteListArray; } Kindly help in this regard. Thank you.

    Read the article

  • How to read an IIS 6 Website's Directory Structure using WMI?

    - by Steve Johnson
    I need to read a website's folders using WMI and C# in IIS 6.0. I am able to read the virtual directories and applications using the "IISWebVirtualDirSetting" class. However, the physical folders located inside a website cannot be read using this class, and in my case I need to read sub-folders located within a website and later on set permissions on them. For my requirement I don't need to work on Virtual Directories/Web Service Applications (which can be easily obtained using the code below). I have tried to use the IISWebDirectory class but it has not been useful. Here is the code that reads IIS Virtual Directories... public static ArrayList RetrieveVirtualDirList(String ServerName, String WebsiteName) { ConnectionOptions options = SetUpAuthorization(); ManagementScope scope = new ManagementScope(string.Format(@"\\{0}\root\MicrosoftIISV2", ServerName), options); scope.Connect(); String SiteId = GetSiteIDFromSiteName(ServerName, WebsiteName); ObjectQuery OQuery = new ObjectQuery(@"SELECT * FROM IISWebVirtualDirSetting"); //ObjectQuery OQuery = new ObjectQuery(@"SELECT * FROM IIsSetting"); ManagementObjectSearcher WebSiteFinder = new ManagementObjectSearcher(scope, OQuery); ArrayList WebSiteListArray = new ArrayList(); ManagementObjectCollection WebSitesCollection = WebSiteFinder.Get(); String WebSiteName = String.Empty; foreach (ManagementObject WebSite in WebSitesCollection) { WebSiteName = WebSite.Properties["Name"].Value.ToString(); WebsiteName = WebSiteName.Replace("W3SVC/", ""); String extrctedSiteId = WebsiteName.Substring(0, WebsiteName.IndexOf('/')); String temp = WebsiteName.Substring(0, WebsiteName.IndexOf('/') + 1); String VirtualDirName = WebsiteName.Substring(temp.Length); WebsiteName = WebsiteName.Replace(SiteId, ""); if (extrctedSiteId.Equals(SiteId)) //if (true) { WebSiteListArray.Add(VirtualDirName ); //WebSiteListArray.Add(WebSiteName); //+ "|" + WebSite.Properties["Path"].Value.ToString() } } return WebSiteListArray; } P.S.: I need to programmatically get the sub-folders of an already deployed site(s) using WMI and C# in an ASP.NET application. I need to find out the sub-folders of existing websites on a local or remote IIS 6.0 web server, so I require a programmatic solution. If I am pointed at the right class (like IISWebVirtualDirSetting etc.) that I may use for retrieving the list of physical folders within a website, that would be quite helpful. I am not working in PowerShell and I don't really need a solution that involves PowerShell or VBScript. Any alternative programmatic way of doing the same in C#/ASP.NET will also be highly appreciated.

    Read the article

  • What would cause an IIS6 website to be unavailable remotely randomly for a few minutes at a time?

    - by jskunkle
    The website is served by IIS 6 on Windows Server 2003. We never saw this problem once in months of beta. We made the new site live yesterday - it's getting more traffic than in beta, but not that much - resource utilization on the server and speed are fine. Today the site has been unavailable remotely a few (4?) times, for a few minutes at a time. If you visit any page on the site, nothing is ever returned and eventually the request times out. While this is happening, I can connect to the server via remote desktop and the site loads fine from the live URL when running a browser on the server locally. Other websites on the server continue to function fine the entire time (using the same instance of IIS, different app pools). Other computers on the same network can't access the website either. Other than not serving content, the server seems to behave normally - scheduled jobs in our custom job system continue to run, etc. We've looked at the IIS logs quickly and we don't see any traffic out of the ordinary - no traffic spikes, etc. Any ideas? Thanks, Shane

    Read the article

  • What are the "least legally restrictive" well-connected countries to host a website?

    - by monster
    NB: I am aware that this question is subjective, as it can't be defined precisely, but the answers should still be "objective": country name, and what makes it legally safer. EDIT: A) I am located in Germany. B) I am NOT looking for a place to offer pirated software/media; there are no binaries on my site, except a "profile icon". Hello! I want to start publishing "social" websites/apps, and I found that the biggest initial problem is this: any and all services I have to depend on, including Domain Registrar, DNS provider, Server/Cloud Provider, CDN Provider, ... even my Insurance Agent, basically say that they can "throw me out" if my website contains "unacceptable" content. It's always phrased in such a way that basically anything can fall under "unacceptable" content. This is very frustrating because you just can't fully control what users post on your "social website", so you basically have to expect when you go to bed that your site is going to be gone when you wake up. I've heard a lot of horror stories about this. Since the "Terms of Service" of all those providers exist foremost to protect them from legal action, and those legal actions depend on the country where they are located, it seems like the first step is to find which country is the "safest" in which to locate a site. "Safest" being defined as: where I am least likely to get in legal trouble with the local authorities if some user posts something unacceptable in some way. The main restriction is that it should also be a "well-connected" country, because there is no point in being "safe" if my users can't get to my sites, or the latency is unacceptable. I am targeting English-speaking people in any country as my future users.

    Read the article
