Search Results

Search found 20795 results on 832 pages for 'personal website'.

Page 84/832

  • serving files via subdomain

    - by muntasir
    Does serving files like images from a subdomain improve page load speed? I read somewhere that serving static files such as images and JavaScript from a cookieless domain is supposed to make a difference to website speed.
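
    For context: browsers attach a site's cookies to every request to the same domain, so serving static assets from a hostname that never sets cookies shaves a little off every request and lets some browsers parallelize downloads across hostnames. A minimal sketch of the markup, with static.example.com as a hypothetical cookie-free subdomain:

      <!-- static assets referenced from a cookieless subdomain (hypothetical names) -->
      <link rel="stylesheet" href="http://static.example.com/css/site.css">
      <script src="http://static.example.com/js/app.js"></script>
      <img src="http://static.example.com/img/logo.png" alt="Logo">

    The saving per request is small, but it adds up on pages with many assets; a CDN hostname achieves the same effect.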

    Read the article

  • Websites that introduce new stuff

    - by user33929
    Is there any blog/website that introduces new fun/fancy websites or new tech (software/hardware)? I am talking about a site that might have introduced digg.com, superuser.com, mint.com, or delicious.com when those sites first came out, and that also covers new handy/useful third-party applications/tools. Thanks.

    Read the article

  • Why doesn't my web page load completely every time?

    - by Gayanee Wijayasekara
    I have an intranet website running on IIS 7. When I try to load my site, it reacts differently every time. These are the scenarios that occur when I try to load it:
    1. The site loads right away and works properly.
    2. The site loads slowly and some of my styling/images/JavaScript do not appear to load correctly.
    3. I receive a "503 Service Unavailable" error.
    Any ideas why this is happening?

    Read the article

  • DVD (vob file) to online video viewer?

    - by Nick
    I've been sent a DVD which needs to be put onto a website, but I honestly don't even know where to start. Do I simply convert the file to MP4(?!) using some software and then use something like http://videojs.com/ to play it online? I'm really sorry for the vague question, but I want to produce the best results: good compression, good quality, and a nice video player interface. I would really appreciate any recommendations. Thank you!
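
    For reference, one common route is to transcode the VOB to H.264 video with AAC audio in an MP4 container, which HTML5 players like Video.js handle natively. A minimal ffmpeg sketch (file names are placeholders; -crf around 18-23 trades size against quality):

      # convert a DVD VOB to a web-friendly MP4; lower -crf means higher quality
      ffmpeg -i input.vob -c:v libx264 -crf 20 -preset slow \
             -c:a aac -b:a 160k -movflags +faststart output.mp4

    The -movflags +faststart option moves the index to the front of the file so playback can begin before the download finishes.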

    Read the article

  • Outlook 2007 Backup to D:\Outlook Fails - Access Denied, Write-Protected or File In Use

    - by nicorellius
    I can successfully save the Outlook PST file to the default location on the C drive (C:\Documents and Settings\user\ ... \Outlook), but when I change the backup directory to Outlook on the D drive I get the error: "Cannot copy Outlook: Access is denied. Make sure the disk is not full or write-protected and that the file is not currently in use." I suppose it is not crucial that I save the file there, but I have never seen this problem before and I have made this same change in the past. I did some searching in this knowledge exchange as well as elsewhere on changing permissions, etc., but this didn't help. I discovered that the folder on my D drive (called Outlook) is neither write-protected nor read-only, as I can save to and modify files in that directory, as well as rename and delete the directory itself. When I installed this version of Outlook, I opened a previously saved Personal Folder (a backup PST file), and I thought having it still open in Outlook was causing the trouble, but I closed it and still have the same problem. I know this is probably a silly error on my part, but I would like to figure it out. I'm new to Super User, but the answers I see are usually very good, so I thought I would post my first question. Thanks in advance.

    Read the article

  • Handshake violation when trying to access one website

    - by Miguel
    I have a TZ 190 Wireless Enhanced running SonicOS Enhanced 4.2.1.0-20e. Yesterday, people could access a bank website which uses HTTPS without any problems. Today it is impossible to access only that website; every other one works fine. When I check the log, filtered to my IP only, this is what appears, and I suspect it is the cause of the problem, since all other websites are working:

      Priority: Notice
      Category: Network Access
      Message: TCP handshake violation detected; TCP connection dropped
      Source: X.Y.Z.3, 51997, LAN (admin)
      Destination: 200.14.232.18, 443, WAN
      Notes: Handshake Timeout

    where X.Y.Z.3 is my local IP. I've tried changing the TCP settings under the Firewall options, activating "Enforce strict TCP compliance with RFC 793 and RFC 1122" and "Enable TCP checksum enforcement", with no success. I also tried to find the MTU: at first I got "Packet needs to be fragmented but DF set", but when I lower the ping -f -l value to 1468 I get "Request timed out". I also deactivated CFS in the LAN and WAN zones. Nothing works. Can you please help me? Any ideas?

    Read the article

  • Kerberos & single sign-on for website

    - by Dylan Klomparens
    I have a website running on a Linux computer using Apache. I've employed mod_auth_kerb for single sign-on Kerberos authentication against a Windows Active Directory server. In order for Kerberos to work correctly, I've created a service account in Active Directory called dummy. I've generated a keytab for the Linux web server using ktpass.exe on the Windows AD server with this command:

      ktpass /out C:\krb5.keytab /princ HTTP/[email protected] /mapuser [email protected] /crypto RC4-HMAC-NT /ptype KRB5_NT_PRINCIPAL /pass xxxxxxxxx

    I can successfully get a ticket from the Linux web server using this command:

      kinit -k -t /path/to/keytab HTTP/[email protected]

    ... and view the ticket with klist. I have also configured my web server with these Kerberos properties:

      <Directory />
        AuthType Kerberos
        AuthName "Example.com Kerberos domain"
        KrbMethodK5Passwd Off
        KrbAuthRealms EXAMPLE.COM
        KrbServiceName HTTP/[email protected]
        Krb5KeyTab /path/to/keytab
        Require valid-user
        SSLRequireSSL
        <Files wsgi.py>
          Order deny,allow
          Allow from all
        </Files>
      </Directory>

    However, when I attempt to log in to the website (from another desktop, with username 'Jeff') my Kerberos credentials are not automatically accepted by the web server. It should grant me access immediately after that, but it does not. The only information I get from the mod_auth_kerb logs is:

      kerb_authenticate_user entered with user (NULL) and auth_type Kerberos

    However, more information is revealed when I change the mod_auth_kerb setting KrbMethodK5Passwd to On:

      [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(1939): [client xxx.xxx.xxx.xxx] kerb_authenticate_user entered with user (NULL) and auth_type Kerberos
      [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(1031): [client xxx.xxx.xxx.xxx] Using HTTP/[email protected] as server principal for password verification
      [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(735): [client xxx.xxx.xxx.xxx] Trying to get TGT for user [email protected]
      [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(645): [client xxx.xxx.xxx.xxx] Trying to verify authenticity of KDC using principal HTTP/[email protected]
      [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(1110): [client xxx.xxx.xxx.xxx] kerb_authenticate_user_krb5pwd ret=0 [email protected] authtype=Basic

    What am I missing? I've studied a lot of online tutorials and cannot find a reason why the Kerberos credentials are not allowing access.
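
    Two hedged checks worth running before digging deeper: "kerb_authenticate_user entered with user (NULL)" often means the browser never offered a Negotiate token at all (for example, Firefox only does so for sites listed in network.negotiate-auth.trusted-uris, and IE only for Local intranet zone sites) rather than a server-side failure. It is also worth confirming the keytab and SPN are sane (the SPN below is a placeholder):

      # on the Linux box: does the keytab really contain the HTTP/ principal?
      klist -k /path/to/keytab

      # on a Windows box: is the SPN registered, and registered only once?
      setspn -Q HTTP/url.example.com

    A duplicate SPN registration in AD is a classic cause of silent Kerberos failures, since the KDC cannot decide which account owns the service.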

    Read the article

  • My website loses packets in 70% of countries; how can I determine why it loses packets?

    - by user2511667
    I checked my website with Google's page speed tester and it shows a result of 90/100. I checked it on Pingdom and it shows good results there. But when I check it on cloudmonitor.ca.com, it shows good results in 30% of countries and 100% packet loss in all the others. How can I determine why my website has packet loss, and what is the solution? Is the problem with my server or with my website? I created a new blank HTML page and set it as my index page; after testing, it still shows packet loss, so I guess the problem is not in my website. When I visit my website in a browser, it works fine, but when I ping my domain or IP 198.178.123.219 from the command prompt I get "Request timed out". Why does it time out in the command prompt?
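
    A hedged aside: "Request timed out" on a plain ping usually just means the server or an upstream firewall drops ICMP echo requests, and a monitor that probes with ICMP will then report 100% loss even though HTTP works fine. One way to check, sketched here, is Windows' built-in pathping, which probes every hop on the route and reports per-hop loss:

      rem loss that appears only at intermediate hops is normally ICMP
      rem rate-limiting, not real loss; only the final hop matters
      pathping -q 50 198.178.123.219

    If the final hop shows loss on ICMP while the site loads fine over HTTP, the fix may simply be choosing a monitor that tests over HTTP instead.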

    Read the article

  • Difference between accessing a website using localhost and IP address

    - by Cdeez
    I have developed an ASP.NET website and deployed it to my IIS server. To check that IIS is installed correctly, I type localhost in my address bar, and I get the IIS welcome screen and its documentation in a separate window. Giving the URL of my website, http://localhost/mysites/site2/Default.aspx, I can access my site. Giving my IP address instead of localhost, http://192.168.1.46/mysites/site2/Default.aspx, also works. Just out of curiosity I wanted to see what happens when I use the IP address in the address bar. It asks me for a username and password, saying: "The server 192.168.1.46:80 requires a user name and password." I do not know what username and password it is asking for, and as far as I know, localhost points to my own IP address internally. So what is the difference, and what username and password do I need? Update: In Chrome and IE, just entering localhost displays the welcome screen, but in Firefox, localhost also asks for a username and password.

    Read the article

  • Looking for a short term solution to improve website performance with additional server

    - by Tanim Mirza
    I am working with a small team to run an internal website on PHP 5.3.9 and MySQL 5.0.77. All the files and the database are hosted on a dedicated Linux machine with the following configuration: Intel Xeon E5450, 8 CPU cores @ 3.00 GHz (2992.498 MHz), cache 6148 KB, CentOS (Red Hat Enterprise Linux Server release 5.4). We started small; then the database got bigger, and now website performance has degraded significantly. We often run out of server space, see MySQL overloaded with too many calls, etc. We don't have much experience dealing with these issues. We recently got another server that we are thinking of using to improve performance. Since it has a better configuration, some of us wanted to move everything to the new machine, but I am trying to find out how we can use both machines for optimal performance. I found options such as MySQL clustering, load balancing, etc. I would appreciate any suggestions for this situation: how to utilize two machines in the short term for the best performance. By short term we mean something we can deploy in a month or so. Thanks in advance for your time.
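
    One short-term split that needs no clustering or load balancer: move MySQL to the new machine and leave PHP and the files where they are, so each box carries one workload. A sketch under the assumption that the application's connection details live in one config file (the host name and credentials below are hypothetical):

      <?php
      // before the split the app connected to localhost; after moving MySQL
      // to the second machine it connects over the LAN instead
      $db = new mysqli('db.internal', 'appuser', 'secret', 'appdb');
      if ($db->connect_error) {
          die('DB connection failed: ' . $db->connect_error);
      }

    Remember to bind mysqld to the LAN interface and GRANT access to the web server's address; MySQL replication can come later if read load keeps growing.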

    Read the article

  • Finding ALL IP addresses currently used by a website

    - by Patrick R
    What steps would you take to discover all (or close to all) IP addresses that are currently used by a website? How would you be as exhaustive as possible without calling a website admin and asking for the list of IP addresses? ;) nslookup works, but results will vary based on the DNS server queried. whois is another good tool, and dig is not bad. Let's use Facebook as the example. I'm blocking that site for the majority of our company's users, but some are approved for "research". I cannot easily use OpenDNS because we all appear to come from the same requesting IP address; I could change that, but I don't want to add more VLANs than I already have. I could also use a block rule like regex facebook1 "facebook\.com" (I'm running a Cisco firewall), but that's pretty easy to sidestep. All that being said, I'm asking specifically about finding the IP addresses for a domain, not about other methods by which I can block a domain name.
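
    For what it's worth, two standard probes, sketched below from any machine with dig and whois: sampling A records from several resolvers (large sites rotate answers per query and per region), and querying the RADb routing registry for route objects registered to Facebook's autonomous system (AS32934), which yields whole netblocks rather than individual addresses:

      # sample the rotating A records via different resolvers
      for i in 1 2 3 4 5; do dig +short facebook.com @8.8.8.8; done | sort -u
      dig +short facebook.com @208.67.222.222

      # list the netblocks registered for Facebook's AS in the RADb registry
      whois -h whois.radb.net -- '-i origin AS32934' | grep '^route'

    Blocking by netblock survives DNS rotation far better than blocking resolved addresses one at a time.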

    Read the article

  • Website hosting from home - IIS6

    - by Paul
    I want to host a few websites from home, primarily because I'm using some beta Microsoft software (.NET 4 and EF) and don't want to install it on my production server, which is hosted at eukhost.com. Basically, I'm completely new to this sort of thing. So far, here is what I've done:
    1. Registered the domain name at namecheap.com (let's call it mydomain.com).
    2. Gone to "Nameserver Registration" in the panel and entered my IP address for the NS1 and NS2 records (let's say the IP is 0.0.0.0).
    3. Gone to "Domain Name Server Setup" and entered ns1.mydomain.com & ns2.mydomain.com.
    4. Forwarded requests from port 80 to my internal IP (let's say 192.168.1.254).
    5. Created the website in IIS (I'm only testing with a single website so far, so I have not created any host header values).
    Now, if I type in the IP address (http://0.0.0.0) I get the site as expected. However, if I enter http://www.mydomain.com I get an error saying "DNS Error - Cannot find server". I'm aware that there is a service from DynDNS that will automatically update the record if I have a dynamic address, but my IP has remained static since my ISP installed the connection (in October), so I don't need that. Is there any way I can get the DNS to work just by configuring IIS or something in Windows? I don't really want to pay for any third-party service. Thanks,
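
    A likely gap, offered as a hunch: registering NS1/NS2 at the home IP tells the rest of the internet to send DNS queries to that IP on port 53, but IIS only answers HTTP, so nothing at home is answering DNS, hence "Cannot find server". The usual fix is to leave DNS hosting at the registrar (namecheap's own nameservers) and simply add an A record for www pointing at the static IP. A quick way to confirm that nothing answers DNS at home (the placeholder IP is the one from the question):

      rem ask the home IP directly for the record it is supposedly
      rem authoritative for; a timeout confirms no DNS server is listening
      nslookup www.mydomain.com 0.0.0.0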

    Read the article

  • wget crawling search results of news website

    - by kiltek
    I am trying to crawl the search results of a news website using wget. The name of the website is www.voanews.com. After typing in my search keyword and clicking search, it proceeds to the results. Then I can specify a "to" and a "from" date and hit search again. After this the URL becomes:

      http://www.voanews.com/search/?st=article&k=mykeyword&df=10%2F01%2F2013&dt=09%2F20%2F2013&ob=dt#article

    and the actual content of the results is what I want to download. To achieve this I created the following wget command:

      wget --reject=js,txt,gif,jpeg,jpg \
           --accept=html \
           --user-agent=My-Browser \
           --recursive --level=2 \
           www.voanews.com/search/?st=article&k=germany&df=08%2F21%2F2013&dt=09%2F20%2F2013&ob=dt#article

    Unfortunately, the crawler doesn't download the search results. It only gets into the upper link bar, which contains the "Home, USA, Africa, Asia, ..." links, and saves the articles they link to. It seems like the crawler doesn't check the search result links at all. What am I doing wrong, and how can I modify the wget command to download only the links in the search results list (and of course the pages they link to)?
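
    One thing to rule out first, as a hedged aside: pasted into a shell unquoted, each & in that URL acts as a command separator (backgrounding a truncated wget that only ever sees .../search/?st=article) and # starts a comment, which would produce exactly the observed behavior of crawling only the navigation bar. A quoted sketch (the #article fragment is dropped, since fragments are never sent to the server anyway):

      # --accept=html omitted: wget matches -A patterns against saved file
      # names, and these search URLs do not end in ".html"
      wget --reject=js,txt,gif,jpeg,jpg \
           --user-agent=My-Browser \
           --recursive --level=2 \
           "http://www.voanews.com/search/?st=article&k=germany&df=08%2F21%2F2013&dt=09%2F20%2F2013&ob=dt"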

    Read the article

  • How to download video from a website that uses flash player but

    - by TPR
    Possible Duplicate: Download Flash video file from any video site? Livestream.com seems to use a Flash player to show both live streams and archived/recorded streams (meaning previously shown streams). I want to download the archived streams. I am assuming that it should be much easier to download an archived video from the website than a live stream. Here is a sample video (I am not interested in this particular video, it is just an example):

      http://www.livestream.com/copanamericana/video?clipId=pla_6f9f4d97-e48f-4b04-bcaa-18e281341b0f&utm_source=lslibrary&utm_medium=ui-thumb

    Firefox plugins like DownloadHelper do not work. Any suggestions? If I look at the browser cache, no matter what the website plays, all files have the same size, and if I open them, of course no video gets played. So something clever/funny is going on with the Flash player on livestream.com (yes, even for the archived videos), and it is definitely not the same as downloading videos from YouTube. However, ads played on livestream.com videos are properly stored in the browser cache.

    Read the article

  • sporadic routing to another website when opening a common url

    - by user226098
    I have a strange problem in our office: sometimes when opening a URL from one of our projects, in any browser, it is not the right website that shows up but some other website. In most cases it redirects to google.com with some parameters, like https://www.google.de/?gfe_rd=cr&ei=krOOU8_kGcSKswadyYDQBw&gws_rd=ssl, or just the ugly Google 404 page. But today it remained on the original URL yet showed the content of http://debug.netdna-cdn.com/. This happens about once a week and for no apparent reason. Stranger still, at first it occurred on only a single PC in the network; it now happens on two different computers, both running Windows 8. The problem cannot be fixed by clearing the browser cache, but it can by rebooting the PC or running ipconfig /flushdns, so I think it has something to do with the machine's DNS cache. But I have no idea what the reason is or how I can figure out how to solve it. Any ideas?

    Read the article

  • License for website article

    - by queueoverflow
    On my personal website, I have some technical ideas and some source code snippets that I share with everybody. To make it clear that everyone can use those snippets as they like, as long as I do not have to provide any warranty, I would like to add a license to some of the texts. The bigger programs come with GPLv2+, which I think is a reasonable license for free code. Does it make sense to use the MIT License or the GNU Free Documentation License for these texts, or should I just go with CC-BY? I am a German citizen, and I have heard that the American licenses do not really apply to me at all. If so, that would be another advantage for the Creative Commons family of licenses.

    Read the article

  • Website with login: everything works. Website without login: menu items don't redirect to content

    - by user3660755
    I wanted to put the website online. I saw it still had a login screen, but it needs to be accessible to everyone. Unpublishing the module did not work. I checked the user access view and took the login URL out of the template manager. After this the login screen was gone from the page, but when I click on any of the website's menu items, it doesn't redirect to the content. When I do have the login screen and enter the username and password, all the menus work just fine. I checked the URL: it is the same with or without login. How is this possible? I have been asking a lot of people, but no one seems to be able to give me a hint, and I have been searching for the solution myself for more than a week. My guess is that there must be a conflict in the redirection, but I am not skilled enough to recognize it, I am afraid. Any tips would be more than welcome. Thank you.

    Read the article

  • Error instantiating Texture2D in MonoGame for Windows 8 Metro Apps

    - by JimmyBoh
    I have a game which builds for WindowsGL and Windows 8. The WindowsGL build works fine, but the Windows 8 build throws an error when trying to instantiate a new Texture2D.

    The code:

      var texture = new Texture2D(CurrentGame.SpriteBatch.GraphicsDevice, width, 1); // Error thrown here...
      texture.SetData(FunctionThatReturnsColors());

    You can find the rest of the code on GitHub.

    The error:

      SharpDX.SharpDXException was unhandled by user code
        HResult=-2147024809
        Message=HRESULT: [0x80070057], Module: [Unknown], ApiCode: [Unknown/Unknown], Message: The parameter is incorrect.
        Source=SharpDX
        StackTrace:
          at SharpDX.Result.CheckError()
          at SharpDX.Direct3D11.Device.CreateTexture2D(Texture2DDescription& descRef, DataBox[] initialDataRef, Texture2D texture2DOut)
          at SharpDX.Direct3D11.Texture2D..ctor(Device device, Texture2DDescription description)
          at Microsoft.Xna.Framework.Graphics.Texture2D..ctor(GraphicsDevice graphicsDevice, Int32 width, Int32 height, Boolean mipmap, SurfaceFormat format, Boolean renderTarget)
          at Microsoft.Xna.Framework.Graphics.Texture2D..ctor(GraphicsDevice graphicsDevice, Int32 width, Int32 height)
          at BrewmasterEngine.Graphics.Content.Gradient.CreateHorizontal(Int32 width, Color left, Color right) in c:\Projects\Personal\GitHub\BrewmasterEngine\BrewmasterEngine\Graphics\Content\Gradient.cs:line 16
          at SampleGame.Menu.Widgets.GradientBackground.UpdateBounds(Object sender, EventArgs args) in c:\Projects\Personal\GitHub\BrewmasterEngine\SampleGame\Menu\Widgets\GradientBackground.cs:line 39
          at SampleGame.Menu.Widgets.GradientBackground..ctor(Color start, Color stop, Int32 scrollamount, Single scrollspeed, Boolean horizontal) in c:\Projects\Personal\GitHub\BrewmasterEngine\SampleGame\Menu\Widgets\GradientBackground.cs:line 25
          at SampleGame.Scenes.IntroScene.Load(Action done) in c:\Projects\Personal\GitHub\BrewmasterEngine\SampleGame\Scenes\IntroScene.cs:line 23
          at BrewmasterEngine.Scenes.Scene.LoadScene(Action`1 callback) in c:\Projects\Personal\GitHub\BrewmasterEngine\BrewmasterEngine\Scenes\Scene.cs:line 89
          at BrewmasterEngine.Scenes.SceneManager.Load(String sceneName, Action`1 callback) in c:\Projects\Personal\GitHub\BrewmasterEngine\BrewmasterEngine\Scenes\SceneManager.cs:line 69
          at BrewmasterEngine.Scenes.SceneManager.LoadDefaultScene(Action`1 callback) in c:\Projects\Personal\GitHub\BrewmasterEngine\BrewmasterEngine\Scenes\SceneManager.cs:line 83
          at BrewmasterEngine.Framework.Game2D.LoadContent() in c:\Projects\Personal\GitHub\BrewmasterEngine\BrewmasterEngine\Framework\Game2D.cs:line 117
          at Microsoft.Xna.Framework.Game.Initialize()
          at BrewmasterEngine.Framework.Game2D.Initialize() in c:\Projects\Personal\GitHub\BrewmasterEngine\BrewmasterEngine\Framework\Game2D.cs:line 105
          at Microsoft.Xna.Framework.Game.DoInitialize()
          at Microsoft.Xna.Framework.Game.Run(GameRunBehavior runBehavior)
          at Microsoft.Xna.Framework.Game.Run()
          at Microsoft.Xna.Framework.MetroFrameworkView`1.Run()
        InnerException:

    Is this an error that needs to be solved in MonoGame, or is there something I need to do differently in my engine and game?
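
    A hedged observation: HRESULT 0x80070057 is E_INVALIDARG, meaning Direct3D 11 rejected one of the Texture2DDescription parameters. Since the height is hard-coded to 1, the prime suspect is width arriving as 0 when UpdateBounds fires before the window has been sized. A defensive sketch while the root cause is confirmed:

      // 0x80070057 (E_INVALIDARG): D3D11 refuses zero-sized textures, so
      // clamp the gradient width until the viewport has a real size
      if (width <= 0)
          width = 1;
      var texture = new Texture2D(CurrentGame.SpriteBatch.GraphicsDevice, width, 1);
      texture.SetData(FunctionThatReturnsColors());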

    Read the article

  • Polling a web URL until event

    - by Jaxo
    I'm really sorry about the crappy title - if anybody has a better way of wording it, please edit it! I basically need a C# application to run a function when the output of a URL is a certain value. For example, if the website says blue, the background colour will turn blue; red will make it red, and so on. The problem is that I don't want to spam my web server with checks. The 4 bytes it downloads each time are negligible, but if I were to deploy this type of system on multiple computers, it would get slower and slower and the bandwidth would add up quickly. So my question is: how can my desktop application run a piece of code only when a web URL has a different output, without fetching it each time? I can't use sockets, and any sort of LAN protocol won't end up working. The reasoning behind this potentially nefarious code is to be able to mute computers by updating a file on the website (as you may have seen in my previous question today, sorry!). I'd like it to be rather quick: not a refresh time of minutes apart, but a few seconds at most. How can I accomplish this? The website's code is easy; getting the C# application to check when it changes is the part I'm stuck on. Nothing shows up on the website other than the command. Thanks!
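
    HTTP has machinery for almost exactly this: the conditional GET. The client remembers the Last-Modified date from its previous response and sends it back as If-Modified-Since; the server replies 304 Not Modified with an empty body when nothing has changed, so each poll costs only headers. A minimal sketch, assuming a hypothetical command URL and using the classic HttpWebRequest API:

      using System;
      using System.IO;
      using System.Net;

      class Poller
      {
          static DateTime lastSeen = DateTime.MinValue;

          // Conditional GET: the server answers 304 Not Modified (no body)
          // unless the resource changed since our last successful fetch.
          public static string FetchIfChanged(string url)
          {
              var request = (HttpWebRequest)WebRequest.Create(url);
              request.IfModifiedSince = lastSeen;
              try
              {
                  using (var response = (HttpWebResponse)request.GetResponse())
                  using (var reader = new StreamReader(response.GetResponseStream()))
                  {
                      lastSeen = response.LastModified; // remember for the next poll
                      return reader.ReadToEnd();
                  }
              }
              catch (WebException ex)
              {
                  var http = ex.Response as HttpWebResponse;
                  if (http != null && http.StatusCode == HttpStatusCode.NotModified)
                      return null;                      // unchanged since last poll
                  throw;
              }
          }
      }

    This still polls, but at a few header bytes per check a short interval is affordable; true push would need the server to hold the request open (long polling).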

    Read the article

  • I am getting a 400 Bad Request error when using Nginx and PHP-FPM, why?

    - by Bob
    I am trying to run a website (it requires PHP; it doesn't technically require MySQL at this time, but it may in the near future as I continue developing it, so I went ahead and installed that as well) using nginx 1.2.4 and PHP-FPM 5.3.3 on Ubuntu 12.04.1 LTS. As far as I know I haven't done anything wrong, but clearly something is not quite right: I get a 400 Bad Request error whenever I try to browse to my website. I've been mostly following one guide, and I've done more or less everything it recommends, except that I did not set up PHP-FPM to use a Unix socket and I used service rather than /etc/init.d/ when starting/stopping nginx, PHP, and MySQL. Anyway, here are my relevant configuration files (I have censored personal/sensitive details, like my domain name, which contains my real name):

    /etc/nginx/nginx.conf

      user www-data;
      worker_processes 4;
      pid /var/run/nginx.pid;

      events {
          worker_connections 768;
          # multi_accept on;
      }

      http {
          ##
          # Basic Settings
          ##
          sendfile on;
          tcp_nopush on;
          tcp_nodelay on;
          keepalive_timeout 15;
          types_hash_max_size 2048;
          # server_tokens off;
          # server_names_hash_bucket_size 64;
          # server_name_in_redirect off;
          include /etc/nginx/mime.types;
          default_type application/octet-stream;

          ##
          # Logging Settings
          ##
          access_log /var/log/nginx/access.log;
          error_log /var/log/nginx/error.log;

          ##
          # Gzip Settings
          ##
          gzip on;
          gzip_disable "msie6";
          # gzip_vary on;
          # gzip_proxied any;
          # gzip_comp_level 6;
          # gzip_buffers 16 8k;
          # gzip_http_version 1.1;
          # gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;

          ##
          # nginx-naxsi config
          ##
          # Uncomment it if you installed nginx-naxsi
          ##
          #include /etc/nginx/naxsi_core.rules;

          ##
          # nginx-passenger config
          ##
          # Uncomment it if you installed nginx-passenger
          ##
          #passenger_root /usr;
          #passenger_ruby /usr/bin/ruby;

          ##
          # Virtual Host Configs
          ##
          include /etc/nginx/conf.d/*.conf;
          include /etc/nginx/sites-enabled/*;
      }

    /etc/nginx/sites-enabled/subdomain.mydomain.net

      server {
          listen 80;        # listen for IPv4
          listen [::]:80;   # listen for IPv6
          server_name www.subdomain.mydomain.net subdomain.mydomain.net;

          access_log /srv/www/subdomain.mydomain.net/logs/access.log;
          error_log /srv/www/subdomain.mydomain.net/logs/error.log;

          location / {
              root /srv/www/subdomain.mydomain.net/public;
              index index.php;
          }

          location ~ \.php$ {
              try_files $uri =400;
              include fastcgi_params;
              fastcgi_split_path_info ^(.+\.php)(/.+)$;
              fastcgi_pass 127.0.0.1:9000;
              fastcgi_index index.php;
              fastcgi_param SCRIPT_FILENAME /srv/www/subdomain.mydomain.net/public$fastcgi_script_name;
          }
      }

    All the directories listed in the configuration files above are correct on my server (to the best of my knowledge). I have not included /etc/php5/fpm/pool.d/www.conf or /etc/php5/fpm/php.ini in this post as they're rather long, but I have posted them on Pastebin: http://pastebin.com/ensErJD8 and http://pastebin.com/T23dt7vM, respectively. The only thing I've changed in either of the two files was in php.ini, where I set expose_php to off so as to hide the .php file extension from users. What can I do to resolve my issue? Please let me know if I need to supply any additional details.
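
    One likely culprit, offered as a hunch: root is declared only inside location /, so the location ~ \.php$ block has no document root of its own; try_files $uri then looks in nginx's compiled-in default root, finds nothing, and returns the configured fallback, =400, for every PHP request. A sketch of the server block with root hoisted to the server level so both locations resolve files:

      server {
          listen 80;
          listen [::]:80;
          server_name www.subdomain.mydomain.net subdomain.mydomain.net;

          root /srv/www/subdomain.mydomain.net/public;   # moved out of location /
          index index.php;

          location / {
              try_files $uri $uri/ /index.php;
          }

          location ~ \.php$ {
              try_files $uri =404;            # 404 reads more truthfully than 400
              include fastcgi_params;
              fastcgi_pass 127.0.0.1:9000;
              fastcgi_index index.php;
              fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
          }
      }

    With $document_root in SCRIPT_FILENAME, the path no longer needs to be repeated, and a missing script now surfaces as a 404 instead of a misleading 400.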

    Read the article
