Search Results

  • DVD (vob file) to online video viewer?

    - by Nick
    I've been sent a DVD which needs to be put onto a website, but I honestly don't even know where to start. Do I simply convert the file to MP4 using some software and then use something like http://videojs.com/ to view it online? I'm really sorry for the vague question, but I want to produce the best results: good compression, good quality, and a nice video player interface. Would really appreciate any recommendations. Thank you!
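
    A minimal conversion sketch, assuming ffmpeg is available (the input file name and quality settings are illustrative starting points, not tuned values):

        # Convert the DVD's VOB to H.264/AAC in an MP4 container for web playback;
        # lower -crf means higher quality and a larger file (18-23 is a common range).
        ffmpeg -i VTS_01_1.VOB -c:v libx264 -crf 20 -preset slow \
               -c:a aac -b:a 160k -movflags +faststart output.mp4

    The -movflags +faststart option moves the index to the front of the file so playback can begin before the download finishes; the resulting MP4 is what HTML5-based players like video.js expect.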

  • Handshake violation when trying to access one website

    - by Miguel
    I have a TZ 190 Wireless Enhanced with SonicOS Enhanced 4.2.1.0-20e. Yesterday, people could access a bank website which uses HTTPS without any problems. Today, it is impossible to access only that website; every other one works without problems. When checking the log, filtered to my IP only, this is what appears, and I suspect it is the cause of the problem, because all other websites are working:

        Priority: Notice
        Category: Network Access
        Message: TCP handshake violation detected; TCP connection dropped
        Source: X.Y.Z.3, 51997, LAN (admin)
        Destination: 200.14.232.18, 443, WAN
        Notes: Handshake Timeout

    Here X.Y.Z.3 is my local IP. I've tried changing the TCP settings under the Firewall option and activated these options, with no success: "Enforce strict TCP compliance with RFC 793 and RFC 1122" and "Enable TCP checksum enforcement". I've also tried to find the MTU: at first I got "Packet needs to be fragmented but DF set", but when I lower the value of ping -f -l to 1468 I get "Request timed out". I also deactivated CFS in the LAN and WAN zones. Nothing works. Can you please help me? Any ideas?
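
    For reference, a sketch of the path-MTU probe described above, run from a Windows machine (the target is the bank's address from the log):

        rem ping's -l flag sets the ICMP payload size; IP and ICMP headers add 28 bytes,
        rem so a 1472-byte payload probes the usual 1500-byte Ethernet MTU.
        ping -f -l 1472 200.14.232.18
        rem Step the payload down until the fragmentation error disappears;
        rem the largest payload that gets through, plus 28, is the path MTU.
        ping -f -l 1464 200.14.232.18

    Note that if the far end filters ICMP, every probe times out regardless of size (which matches the behaviour described above), so ping-based MTU discovery against that one host may simply not be possible.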

  • Kerberos & single sign-on for website

    - by Dylan Klomparens
    I have a website running on a Linux computer using Apache. I've employed mod_auth_kerb for single sign-on Kerberos authentication against a Windows Active Directory server. In order for Kerberos to work correctly, I've created a service account in Active Directory called dummy. I've generated a keytab for the Linux web server using ktpass.exe on the Windows AD server, using this command:

        ktpass /out C:\krb5.keytab /princ HTTP/[email protected] /mapuser [email protected] /crypto RC4-HMAC-NT /ptype KRB5_NT_PRINCIPAL /pass xxxxxxxxx

    I can successfully get a ticket from the Linux web server using this command:

        kinit -k -t /path/to/keytab HTTP/[email protected]

    ... and view the ticket with klist. I have also configured my web server with these Kerberos properties:

        <Directory />
            AuthType Kerberos
            AuthName "Example.com Kerberos domain"
            KrbMethodK5Passwd Off
            KrbAuthRealms EXAMPLE.COM
            KrbServiceName HTTP/[email protected]
            Krb5KeyTab /path/to/keytab
            Require valid-user
            SSLRequireSSL
            <Files wsgi.py>
                Order deny,allow
                Allow from all
            </Files>
        </Directory>

    However, when I attempt to log in to the website (from another desktop, with username 'Jeff'), my Kerberos credentials are not automatically accepted by the web server. It should grant me access immediately, but it does not. The only information I get from the mod_auth_kerb logs is:

        kerb_authenticate_user entered with user (NULL) and auth_type Kerberos

    However, more information is revealed when I change the mod_auth_kerb setting KrbMethodK5Passwd to On:

        [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(1939): [client xxx.xxx.xxx.xxx] kerb_authenticate_user entered with user (NULL) and auth_type Kerberos
        [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(1031): [client xxx.xxx.xxx.xxx] Using HTTP/[email protected] as server principal for password verification
        [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(735): [client xxx.xxx.xxx.xxx] Trying to get TGT for user [email protected]
        [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(645): [client xxx.xxx.xxx.xxx] Trying to verify authenticity of KDC using principal HTTP/[email protected]
        [Fri Oct 18 17:26:44 2013] [debug] src/mod_auth_kerb.c(1110): [client xxx.xxx.xxx.xxx] kerb_authenticate_user_krb5pwd ret=0 [email protected] authtype=Basic

    What am I missing? I've studied a lot of online tutorials and cannot find a reason why the Kerberos credentials are not allowing access.
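
    Two diagnostics worth running here (a sketch, not a definitive fix). First, a stale keytab is a common silent failure: if the service account's password was reset after ktpass ran, the key version number (kvno) in the keytab no longer matches the KDC's copy:

        # List the keytab entries and their kvno values
        klist -k -t /path/to/keytab
        # Authenticate as the service, then ask the KDC for its current kvno
        kinit -k -t /path/to/keytab HTTP/[email protected]
        kvno HTTP/[email protected]
        # The kvno reported by the KDC must match the keytab entry above

    Second, "user (NULL)" also appears when the browser never attempts Negotiate at all: Firefox only sends Kerberos tickets to sites listed in network.negotiate-auth.trusted-uris, and IE only to sites in the Local intranet zone.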

  • My website loses packets in 70% of countries; how can I determine why it loses packets?

    - by user2511667
    I checked my website on Google PageSpeed and it shows a result of 90/100. I checked my website on Pingdom and it shows good results there. But when I check my website on cloudmonitor.ca.com, it shows good results in only 30% of countries; in all other countries it shows 100% packet loss. How can we determine why my website has packet loss, and what is the solution? Is the problem with my server or with my website? I created a new blank HTML page and set it as my index page; after testing, it still shows packet loss, so I guess this means the problem is not in my website. Here is the live result. When I visit my website in a browser, the website works fine. But when I test my domain or IP 198.178.123.219 from the command prompt, it shows "Request timed out". Why the timeout at the command prompt?
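
    A sketch for separating "ICMP is filtered" from genuine packet loss: monitoring probes and ping use ICMP, while real visitors use TCP on port 80, so the two can disagree completely.

        # Hop-by-hop loss report; loss that appears only at the final hop
        # usually means the host or its provider filters ICMP, not that traffic drops.
        mtr --report --report-cycles 20 198.178.123.219

        # Test the actual HTTP service over TCP instead of ICMP:
        curl -o /dev/null -s -w 'HTTP %{http_code} in %{time_total}s\n' http://198.178.123.219/

    If the curl check succeeds from the same places where ping fails, the "loss" is an artifact of ICMP filtering rather than a real problem for visitors.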

  • Difference between accessing a website using localhost and IP address

    - by Cdeez
    I have developed an ASP.NET website and deployed it to my IIS server. To check that IIS is installed correctly, I type localhost in my address bar, and I get the IIS welcome screen (and its documentation in a separate window). When I give the URL of my website, http://localhost/mysites/site2/Default.aspx, I can access my site. Giving my IP address instead of localhost, http://192.168.1.46/mysites/site2/Default.aspx, also works. Just out of curiosity I wanted to see what happens when I give only my IP address in the address bar. It asks me for a username and password, saying: "The server 192.168.1.46:80 requires a user name and password." I do not know what username and password it is asking for, and to my knowledge I thought localhost pointed to my own IP address internally. What is the difference, and what username and password do I need? Update: On Chrome and IE just giving localhost displays the welcome screen, but on Mozilla, localhost also asks for a username and password.

  • Looking for a short-term solution to improve website performance with an additional server

    - by Tanim Mirza
    I am working with a small team to run an internal website on PHP 5.3.9 and MySQL 5.0.77. All the files and the database are hosted on a dedicated Linux machine with the following configuration: Intel Xeon E5450, 8 CPU cores @ 3.00GHz (2992.498 MHz), 6148 KB cache, CentOS / Red Hat Enterprise Linux Server release 5.4. We started small, then the database got bigger, and now the website's performance has degraded significantly. We often run out of server space, MySQL gets overloaded with too many calls, etc. We don't have much experience dealing with these issues. We recently got another server that we are thinking of using to improve performance. Since it has a better configuration, some of us wanted to move everything to the new machine, but I am trying to find out how we can utilize both machines for optimal performance. I found options such as MySQL clustering, load balancers, etc. I would appreciate any suggestions for how to utilize two machines in the short term for best performance; by short term we are looking for something we can deploy in a month or so. Thanks in advance for your time.
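
    A minimal sketch of the lowest-effort split (addresses, database name, and credentials are hypothetical): move MySQL onto the new server and keep PHP/Apache on the old one, so each tier gets its own CPU, RAM, and disk I/O.

        # /etc/my.cnf on the new database server (192.168.1.20):
        [mysqld]
        bind-address = 192.168.1.20

        # Then, from the mysql client on the new server, let the web server in:
        #   GRANT ALL ON appdb.* TO 'appuser'@'192.168.1.10' IDENTIFIED BY 'secret';

    After that the application's connection settings point at 192.168.1.20 instead of localhost. Replication or clustering can come later; this split alone is deployable well within a month and often buys significant headroom.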

  • Thunderbird 3: create a single column to display the 'From' field on a received message and the 'Recipient' field on a sent message

    - by dfree
    If the title doesn't say it all let me know. This would be helpful for IMAP folders within T-bird, when I'm looking through threads on a particular issue, but don't know if I sent the last message to the recipient, or they sent the last message to me. I would be able to quickly scroll in one column and see exactly when the last communication on that issue was. Is there a way to make a 'smart' column that would do this?

  • Finding ALL currently used IP addresses of a website

    - by Patrick R
    What steps would you take to discover all (or close to all) IP addresses that are currently used by a website? How would you be as exhaustive as possible without calling the website's admin and asking for the list? ;) nslookup works, but the results will vary based on the DNS server queried. whois is another good tool, and dig is not bad. Let's use Facebook as an example. I'm blocking that site for the majority of our company's users, but some are approved for "research". I cannot easily use OpenDNS because we all appear to come from the same request IP address. I could change that, but I don't want to add more VLANs than I already have. I could also use a block like regex facebook1 "facebook\.com" (I'm running a Cisco firewall), but that's pretty easy to sidestep. All that being said, I'm asking specifically about finding the IP addresses for a domain, not about other methods of blocking a domain name.
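
    A sketch of the exhaustive approach: enumerate DNS answers from several resolvers, then pull the site's announced address ranges from the routing registries, since large sites publish far more address space than any single DNS query returns. Facebook's AS number below is public routing information; substitute the AS of whatever domain you're chasing:

        # Ask several resolvers, since CDN-style DNS hands out different answers per region:
        for ns in 8.8.8.8 208.67.222.222 4.2.2.2; do
            dig +short @$ns facebook.com A
            dig +short @$ns www.facebook.com A
        done

        # Every prefix announced by Facebook's autonomous system (AS32934),
        # via the RADB routing registry; this is closer to the block list you want:
        whois -h whois.radb.net -- '-i origin AS32934' | grep '^route:'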

  • Website hosting from home - IIS6

    - by Paul
    I'm wanting to host a few websites from home, primarily because I'm using some beta Microsoft software (.NET 4 and EF) and don't want to install it on my production server, which is hosted at eukhost.com. Basically, I'm completely new to this sort of thing. So far, here is what I've done:

    - Registered the domain name at namecheap.com (let's call it mydomain.com)
    - Gone to "Nameserver Registration" in the panel and entered my IP address for the NS1 and NS2 records (let's say the IP is 0.0.0.0)
    - Gone to "Domain Name Server Setup" and entered ns1.mydomain.com & ns2.mydomain.com
    - Forwarded requests from port 80 to my internal IP (let's say 192.168.1.254)
    - Created the website in IIS (I'm just testing with a single website so far, so have not created any host header values)

    Now, if I type in the IP address (http://0.0.0.0) I get the site as expected. However, if I enter http://www.mydomain.com I get an error saying "DNS Error - Cannot find server". I'm aware that there is a service from DynDNS that will automatically update the record if I have a dynamic address, but my IP has remained static since my connection was installed in October, so I don't need this. Is there any way I can get the DNS to work just by configuring IIS or something in Windows? I don't really want to have to pay for any third-party service. Thanks
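
    A note on the likely gap (an assumption drawn from the steps above, not a confirmed diagnosis): registering ns1/ns2 at your own IP tells the world to send DNS queries to your machine, so something must actually answer on port 53 there; IIS alone never will. The usual shortcut is to keep the registrar's own nameservers and publish plain address records, roughly:

        ; Zone data served by the registrar's DNS (hypothetical values):
        mydomain.com.       IN  A      0.0.0.0        ; your static public IP
        www.mydomain.com.   IN  CNAME  mydomain.com.

    With that in place, no DNS software needs to run at home at all; the port-80 forward to 192.168.1.254 does the rest.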

  • wget crawling search results of news website

    - by kiltek
    I am trying to crawl the search results of a news website using wget. The name of the website is www.voanews.com. After typing in my search keyword and clicking search, it proceeds to the results. Then I can specify a "to" and a "from" date and hit search again. After this the URL becomes:

        http://www.voanews.com/search/?st=article&k=mykeyword&df=10%2F01%2F2013&dt=09%2F20%2F2013&ob=dt#article

    and the actual content of the results is what I want to download. To achieve this I created the following wget command:

        wget --reject=js,txt,gif,jpeg,jpg \
             --accept=html \
             --user-agent=My-Browser \
             --recursive --level=2 \
             www.voanews.com/search/?st=article&k=germany&df=08%2F21%2F2013&dt=09%2F20%2F2013&ob=dt#article

    Unfortunately, the crawler doesn't download the search results. It only gets into the upper link bar, which contains the "Home, USA, Africa, Asia, ..." links, and saves the articles they link to. It seems like the crawler doesn't check the search result links at all. What am I doing wrong, and how can I modify the wget command to download only the search result list links (and of course the sites they link to)?
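
    One thing to rule out first (an assumption about the shell, not a confirmed fix): unquoted, every & in that URL acts as a shell background operator, so wget only ever receives http://www.voanews.com/search/?st=article and crawls from the wrong page. Quoting preserves the full query string; the #article fragment is a client-side anchor that is never sent to the server anyway:

        wget --reject=js,txt,gif,jpeg,jpg \
             --accept=html \
             --user-agent=My-Browser \
             --recursive --level=2 \
             "http://www.voanews.com/search/?st=article&k=germany&df=08%2F21%2F2013&dt=09%2F20%2F2013&ob=dt"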

  • How to download video from a website that uses a flash player but download plugins don't work

    - by TPR
    Possible Duplicate: Download Flash video file from any video site? Livestream.com seems to be using a flash player to show both live streams and archived/recorded streams (meaning previously shown streams). I want to download the archived streams. I am assuming that it should be much easier to download an archived video from the website than a live stream. Here is a sample video (I am not interested in this particular video, it's just an example): http://www.livestream.com/copanamericana/video?clipId=pla_6f9f4d97-e48f-4b04-bcaa-18e281341b0f&utm_source=lslibrary&utm_medium=ui-thumb Firefox plugins like DownloadHelper and the rest do not work. Any suggestions? If I look at the browser cache, no matter what the website plays, all files have the same size! If I open them, of course no video gets played. So something clever/funny is going on with the flash player on livestream.com (yes, even for the archived videos), so it is definitely not the same as downloading videos from YouTube. However, ads played on livestream.com videos are properly stored in the browser cache.
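
    A hedged guess at why the cache files all look alike: identical sizes usually indicate the player is streaming over RTMP rather than fetching plain HTTP downloads, and RTMP content never lands in the browser cache as a playable file. If that is the case here, a tool like rtmpdump can save the stream; the URL and playpath below are placeholders that would have to be captured from the player's traffic (for example with Wireshark or rtmpsrv):

        # Placeholder values only; the real ones must be sniffed from the player:
        rtmpdump -r "rtmp://somehost.example/app" \
                 --playpath "vod/pla_6f9f4d97-e48f-4b04-bcaa-18e281341b0f" \
                 -o archived_stream.flv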

  • Sporadic routing to another website when opening a common URL

    - by user226098
    I have a strange problem in our office: sometimes when opening a URL from one of our projects, in any browser, the right website doesn't show up; some other website does instead. In most cases it redirects to google.com with some parameters (like https://www.google.de/?gfe_rd=cr&ei=krOOU8_kGcSKswadyYDQBw&gws_rd=ssl, or just the ugly Google 404 page). But today it remained on the original URL while showing the content of http://debug.netdna-cdn.com/. This happens about once a week, for no apparent reason. Stranger still, it initially occurred on only a single PC in the network; it now happens on two different computers, both running Windows 8. The problem cannot be fixed by clearing the browser cache, but it can by rebooting the PC or running ipconfig /flushdns, so I think it has something to do with the machine's DNS cache. But I have no idea what the reason is, or how to figure out how to solve it. Any ideas?
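
    Next time it happens, a snapshot of the resolver state before flushing would show whether a stale or poisoned cache entry is to blame (a diagnostic sketch; substitute the hostname of the affected project for the placeholder):

        rem Dump what the local DNS cache currently holds for the affected domain:
        ipconfig /displaydns | findstr /i "myproject.example"
        rem Compare the configured resolver's answer with a known public resolver:
        nslookup myproject.example
        nslookup myproject.example 8.8.8.8

    If the cached or configured-resolver answer differs from the public one at the moment of failure, the problem sits in DNS (local cache, router, or upstream resolver) rather than in the browsers.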

  • Website with login: everything works. Website without login: menu items don't redirect to content

    - by user3660755
    I wanted to put the website online, and saw it still had a login screen. It needs to be accessible to everyone. Unpublishing the module did not work, so I checked the user access view and took the login URL out of the template manager. After this the login screen was gone from the page, but when I click on any of the website's menu items, it doesn't redirect to the content. When I do have the login screen and put in the username and password, all the menus work just fine. I checked the URL and it is the same with or without login. How is this possible? I have been asking a lot of people, but no one seems to be able to give me a hint, and I have been searching for the solution myself for more than a week. My guess is that there must be a conflict in the redirection, but I am not skilled enough to recognize it, I am afraid. Any tips would be more than welcome. Thank you

  • Polling a web URL until event

    - by Jaxo
    I'm really sorry about the crappy title - if anybody has a better way of wording it, please edit it! I basically need to have a C# application run a function when the output of a URL is a certain value. For example, if the website says blue, the background colour will be blue; red to make it red, etc. The problem is I don't want to spam my web server with checks. The 4 bytes it downloads each time are negligible, but if I were to deploy this type of system on multiple computers, it would get slower and slower and the bandwidth would add up quickly. So my question is: how can my desktop application run a piece of code only when a web URL has a different output, without a full check each time? I can't use sockets, and any sort of LAN protocol won't end up working. My reasoning behind this potentially nefarious code is to be able to mute computers by updating a file on the website (as you may have seen in my previous question today, sorry!). I'd like it to be rather quick: not a refresh time minutes apart, but a few seconds at most. How can I accomplish this? The website's code is easy, but getting the C# application to check when it changes is the part I'm stuck on. Nothing shows up on the website other than the command. Thanks!
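
    One standard way to make each poll nearly free is an HTTP conditional GET: the client sends back the Last-Modified value (or ETag) it last saw, and the server replies 304 Not Modified with an empty body until the file really changes. A sketch with curl (the URL is hypothetical; in C# the same header is exposed as HttpWebRequest.IfModifiedSince):

        # First request: note the Last-Modified header in the response.
        curl -si http://example.com/command.txt | grep -i last-modified

        # Subsequent polls echo that value back; status 304 means "unchanged", no body sent:
        curl -s -o /dev/null -w '%{http_code}\n' \
             -H 'If-Modified-Since: Sat, 01 Dec 2012 10:00:00 GMT' \
             http://example.com/command.txt

    This keeps polls cheap but still periodic; removing the interval entirely would need server-side support such as long polling, where the server holds each request open until the value changes.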

  • I am getting a 400 Bad Request error when using Nginx and PHP-FPM, why?

    - by Bob
    I am trying to run a website (that requires PHP - it technically doesn't require MySQL at this time, but it may sometime in the near future as I continue developing it, so I went ahead and installed that as well) using nginx 1.2.4 and PHP-FPM 5.3.3 on Ubuntu 12.04.1 LTS. As far as I know, I haven't done anything wrong, but clearly something is not quite right - I seem to be getting a 400 Bad Request error whenever I try to browse to my website. I've been mostly following one guide, and I've done more or less everything it recommends, except for not setting up PHP-FPM to use a Unix socket, and I used service as opposed to /etc/init.d/ when starting/stopping nginx, PHP, and MySQL. Anyway, here are my relevant configuration files (I have only censored personal/sensitive details, like my domain name, which contains my real name):

    /etc/nginx/nginx.conf:

        user www-data;
        worker_processes 4;
        pid /var/run/nginx.pid;

        events {
            worker_connections 768;
            # multi_accept on;
        }

        http {
            ##
            # Basic Settings
            ##
            sendfile on;
            tcp_nopush on;
            tcp_nodelay on;
            keepalive_timeout 15;
            types_hash_max_size 2048;
            # server_tokens off;
            # server_names_hash_bucket_size 64;
            # server_name_in_redirect off;
            include /etc/nginx/mime.types;
            default_type application/octet-stream;

            ##
            # Logging Settings
            ##
            access_log /var/log/nginx/access.log;
            error_log /var/log/nginx/error.log;

            ##
            # Gzip Settings
            ##
            gzip on;
            gzip_disable "msie6";
            # gzip_vary on;
            # gzip_proxied any;
            # gzip_comp_level 6;
            # gzip_buffers 16 8k;
            # gzip_http_version 1.1;
            # gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;

            ##
            # nginx-naxsi config
            ##
            # Uncomment it if you installed nginx-naxsi
            ##
            #include /etc/nginx/naxsi_core.rules;

            ##
            # nginx-passenger config
            ##
            # Uncomment it if you installed nginx-passenger
            ##
            #passenger_root /usr;
            #passenger_ruby /usr/bin/ruby;

            ##
            # Virtual Host Configs
            ##
            include /etc/nginx/conf.d/*.conf;
            include /etc/nginx/sites-enabled/*;
        }

    /etc/nginx/sites-enabled/subdomain.mydomain.net:

        server {
            listen 80;       # listen for IPv4
            listen [::]:80;  # listen for IPv6

            server_name www.subdomain.mydomain.net subdomain.mydomain.net;

            access_log /srv/www/subdomain.mydomain.net/logs/access.log;
            error_log /srv/www/subdomain.mydomain.net/logs/error.log;

            location / {
                root /srv/www/subdomain.mydomain.net/public;
                index index.php;
            }

            location ~ \.php$ {
                try_files $uri =400;
                include fastcgi_params;
                fastcgi_split_path_info ^(.+\.php)(/.+)$;
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME /srv/www/subdomain.mydomain.net/public$fastcgi_script_name;
            }
        }

    All the directories listed in the configuration files above are correct on my server (to the extent of my knowledge). I have not included /etc/php5/fpm/pool.d/www.conf or /etc/php5/fpm/php.ini in this post as they're rather long, but I have posted them on Pastebin: http://pastebin.com/ensErJD8 and http://pastebin.com/T23dt7vM, respectively. The only thing I've changed in either of the two files was in php.ini, where I set expose_php to off so as to hide the .php file extension from users. What can I do to resolve my issue? Please let me know if I need to supply any additional details.
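
    One detail that stands out (an observation, not a confirmed diagnosis): root is declared only inside location /, so in the location ~ \.php$ block try_files checks $uri against nginx's compiled-in default document root, never finds the script, and the =400 fallback fires for every PHP request. Moving root up into the server block is worth trying, along with watching the logs while reproducing the error:

        # Reproduce the request verbosely and note the exact status line returned:
        curl -v http://www.subdomain.mydomain.net/ 2>&1 | head -n 40

        # Watch both error logs while repeating the request; nginx logs its own 400s here:
        tail -f /srv/www/subdomain.mydomain.net/logs/error.log /var/log/nginx/error.log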

  • How to prevent access to website without SSL connection?

    - by CraigJ
    I have a website with an SSL certificate installed, so that if I access the website using https instead of http, I connect over a secure connection. However, I have noticed that I can still access the website non-securely, i.e. by using http instead of https. How can I prevent people from using the website in a non-secure manner? And if I have a directory on the website, e.g. samples/, can I prevent non-secure connections to just that directory?
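
    The usual approach is a permanent redirect from http to https rather than refusing plain-http connections outright. A sketch assuming Apache with mod_rewrite (in the vhost or an .htaccess file); other servers have equivalents:

        RewriteEngine On
        # Site-wide: send every plain-http request to its https equivalent
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

    To enforce this for just the samples/ directory, place the same rules in samples/.htaccess, or add a condition such as RewriteCond %{REQUEST_URI} ^/samples/ so only that path is redirected.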

  • Reason why a brand new website is ranking for a top keyword? [on hold]

    - by Prasad EBK
    It's been noticed that one of our (new) competitors' websites is ranking 5th for a top keyword with high competition. The website is barely 2 months old. When I checked, not much SEO had been done on the website other than basic title/description tags. No backlinks. The website pushed ours down and took its place for the keyword. The only reason that came to my mind is the latest Penguin update. Or is the ranking just temporary, and will it eventually be pushed back? It has been holding on for at least one month, and it's irritating. Thanks in advance.

  • Is there any good reason I would want my website to be framed?

    - by minitech
    I'm building a website that's not security-critical in any way at all, so having somebody put a page in an <iframe> is not particularly dangerous to its users. However, as my website doesn't have script plugins that will be used anywhere else, is there any reason why I shouldn't just apply X-Frame-Options: DENY to every page on my website? Is there any valid reason for any other website to embed mine? I've seen plenty of content-stealing ones and attempts to hijack user accounts, but never an actual good use of frames that's not an explicit feature of the website.
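
    For reference, a sketch of applying the header server-wide (DENY is the canonical spelling of the value):

        # nginx (inside an http, server, or location block):
        add_header X-Frame-Options DENY;

        # Apache (requires mod_headers):
        Header always set X-Frame-Options DENY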

  • Do WordPress websites get indexed quicker by search engines than regular websites?

    - by guisasso
    I registered a couple of domains with the names of categories of products we sell. I then installed WordPress on one of those domains, played around with it for a bit, and left it alone for about a month. There was a link on my regular website to that secondary website, and that website was also registered in Google's Webmaster Tools, but that's that. I then searched on Google last week for that product category and, to my surprise, the secondary website showed up on the 2nd or 3rd page. Now my question is: do search engines index WordPress websites quicker? I had given up on using WordPress for that website, since it's so simple, but should I use it? Would it give me better results? Thanks in advance for the help, if the question is not deleted.

  • Will Google penalize my website if I hide the H1 tag?

    - by mickburkejnr
    I've read an article today where the author stated that if you put keywords on a page but then hide them with CSS, Google will penalize your site. This makes sense, but it got me thinking about my own technique when I build a website. If, for example, the logo contains the name of the website, I tend to put the name of the website in an H1 tag and then hide the tag. I don't know why I do it; I've always done it. I also include any text held in an image in the alt attribute of the img tag. But because I am hiding the H1 tag, does this leave me open to Google penalizing the website for hiding that one tag?
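
    For context, a sketch of the classic image-replacement pattern described above (the class name and sizes are hypothetical). Shifting the text off-screen while the logo image displays is generally considered less risky than display:none, though only Google knows exactly where its line is:

        /* The H1 keeps its text for crawlers and screen readers,
           while the logo image is painted in its place: */
        h1.logo {
            background: url(logo.png) no-repeat;
            width: 200px;
            height: 60px;
            text-indent: -9999px;  /* move the text out of view */
            overflow: hidden;
        }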

  • Will uploading our .docx files on scribd and embedding the files on our website affect search engine rankings?

    - by user1439968
    We have prepared notes for university students, which are in .docx format, and we want to put them on our website for viewing. We have tried one option: uploading the files to Scribd and embedding them on our website for viewing in the Scribd viewer. Will making the documents available in the Scribd viewer on our website affect search engine rankings? Will search engines treat it as duplicate content, as the documents are already uploaded on Scribd and we are embedding them on our website? (On Scribd we have set the uploaded documents to 'private', though.) And if it does affect rankings, can you suggest a suitable way to make the .docx files viewable on our website that doesn't affect search engine rankings?

  • Should websites live in /var/ or /usr/ according to recommended usage?

    - by nbolton
    According to a guide on the Linux directory structure, /usr/ is for application files, and /var/ is for files that change (I assume this means "files that belong to the applications"). Is this correct? If this is the case then I'm a little torn between using either. A website is an application (if it's dynamic, so to speak), but in other cases it is just a collection of files used by Apache. The default www dir lives in /var/www/, so should we follow suit by using /var/websites/ (or something similar), or choose /usr/websites/ since they could be applications? This is a very trivial question, but it's bugging me nonetheless. For our case, I'm leaning toward /usr/web or something like that, since our websites are all applications. Update: This is for our company websites; it's not a shared hosting server, so we don't need to worry about separating them in /home/ or anything like that.

  • Strategies for very fast delivery of webpages.

    - by Cherian
    I run a website, Cucumbertown, with an initial payload of nearly 9KB zipped. All my JS is delay-loaded with RequireJS; Modernizr is the only exception. All my webpages are cached by nginx, and only 10-15% of hits go to the backend proxy. The cache is invalidated for logged-in users via proxy_cache_bypass, so for an anonymous user it's nearly always a cache hit. I have some basic OS tuning: initcwnd 15 on the default route (ip route ... via ... dev eth0 initcwnd 15) and net.ipv4.tcp_slow_start_after_idle = 0. Despite everything being cached and a large initcwnd, my pages still take 2.5-3 seconds. (YSlow and PageSpeed score screenshots omitted.) Are there strategies that can help deliver webpages even faster than this? Deliver pages in around 1 second for a 10KB payload? Notes: my servers run out of a fairly good Linode data center in Fremont.
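
    For reference, the two tunings mentioned above spelled out as commands (a sketch; the gateway address is a placeholder, and initcwnd requires a reasonably recent kernel):

        # Larger initial congestion window on the default route:
        ip route change default via 192.0.2.1 dev eth0 initcwnd 15

        # Stop the congestion window from shrinking between keep-alive requests:
        sysctl -w net.ipv4.tcp_slow_start_after_idle=0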

  • How to contact an emergency service using only Internet?

    - by Vi.
    Suppose you are away from your mobile or any other phone (so you can't call 112 or 911 or 999), but you have access to a computer with an internet connection. How do you call or message an emergency service (of whatever country, in the hope they will route the request to the correct destination) using only the Internet? Maybe there's some 911-like website, or a public SIP service, or something similar? Or is it better to go to some chat/forum/StackExchange site and ask somebody to make the call for you? (Will users really believe it?) /* 1. I'm not in any emergency, just curious. 2. I'm not sure which SE site to ask this question on. */
