Search Results

Search found 18450 results on 738 pages for 'website attacks'.


  • Google Analytics - bad experiences? (esp. adult content)

    - by Litso
    Hello all, I work for a rather large adult website, and we're currently not using Google Analytics. There is an internal debate going on about whether we should start using Analytics, but there is hesitation from certain parties. The main argument is the fear that Google would gain too much insight into our website and might even drop us from its index because of our adult content. Has anyone here ever had such an experience, or heard of bad experiences with Google Analytics along these lines? I personally think it would only improve our website if we were able to use Analytics, but the dev team was asked to look into possible negative effects. Any help would be appreciated.

    Read the article

  • How to Organize Subdirectories? [closed]

    - by Gary Iverson
    I am working on my first major (several hundred pages) website design and development project. I want to create subdirectories for each of the website's categories, which in turn have their own subdirectories, for easy user navigation. Example: website.com/directory. I am aware that by placing an index.html file within each directory, the server automatically serves that page when the directory is requested, but that seems like a messy solution (having multiple index.html files, albeit in distinct folders), and I cannot imagine everyone does that. So my question is: how do you properly organize and use subdirectories?

    Read the article

  • Avoiding "double" subscriptions

    - by john smith
    I am working on a website that requires a bit of marketing; let me explain. This website is offering a single prize, say a $50 iTunes voucher, to a lucky winner. To be entered in the draw, you need to invite at least one friend to the website (and that friend has to join). Pretty straightforward. Now, of course it would be easy for anyone to just create a fake account and invite that account, so I was thinking of some way to detect possible cheating. My idea is an IP check on the newly subscribed (invited) user: if the same IP was logged in the last 24 hours, investigate further. But maybe there is a more clever way around this issue. Has anyone ever thought about this? What other solutions did you try? Thanks in advance.
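
    A minimal sketch of the IP heuristic described above, checking an invited signup against recent signups from the same address (the signups table and its columns are hypothetical):

        import sqlite3
        from datetime import datetime, timedelta

        # Hypothetical schema: signups(email TEXT, ip TEXT, created_at TIMESTAMP)
        def is_suspicious(conn, invitee_ip):
            # Flag the referral if the same IP already signed up in the last 24 hours.
            cutoff = datetime.utcnow() - timedelta(hours=24)
            (count,) = conn.execute(
                "SELECT COUNT(*) FROM signups WHERE ip = ? AND created_at > ?",
                (invitee_ip, cutoff),
            ).fetchone()
            return count > 0

        # Usage sketch: hold suspicious entries for manual review rather than block.
        conn = sqlite3.connect("site.db")
        if is_suspicious(conn, "203.0.113.7"):
            print("flag this draw entry for review")

    Households and offices behind NAT share an IP, so this should only flag entries for review, not reject them outright; pairing it with a browser cookie or an e-mail-domain check reduces false positives.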

    Read the article

  • When Google AdSense says "Site does not comply with Google policies"?

    - by Danny
    Recently, for my website, I received a new kind of email from Google AdSense, after applying four times and fixing all of the previous disapproval reasons. From the message below, it seems everything else about my website is okay for AdSense. Issue: "Site does not comply with Google policies." No further detail was given. Please suggest how I can solve this issue for my website. I would also request that this question not be closed until I get an answer; I have been struggling with this for nine months. I hope someone can clear up my doubt.

    Read the article

  • Search engine bots accessing strange URLs

    - by casasoft
    We have ELMAH enabled on our site and get errors whenever a Page Not Found error is triggered on the website. We have recently redesigned the website, so we understand that search engine robots might try to access previously indexed pages and trigger Page Not Found errors. For this reason, we have set up permanent redirects from such previously indexed pages to the respective new pages. The website in question is www.chambercollege.com; for example, a previously indexed URL was www.chambercollege.com/special-offers.aspx. This page is no longer accessible, so we created the necessary permanent redirect to the respective page at www.chambercollege.com/en/content/special-offers-161/. Now we are starting to receive Page Not Found errors from search engine bots (e.g. the MSN bot) trying to access the URL www.chambercollege.com/special-offers.aspx/images/shadow_right.jpg/. Any idea how a search engine could make up that strange URL, and any suggestions on what best to do?
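
    One plausible origin for that URL is relative-link resolution: if the old page referenced an image with a relative path (e.g. images/shadow_right.jpg) and a crawler treats the old URL as a directory, the mangled path falls out naturally. A quick illustration using Python's standard library (the base URL below just mirrors the question):

        from urllib.parse import urljoin

        # A crawler that treats the old page URL as a directory (note the
        # trailing slash) resolves the page's relative image path like this:
        base = "http://www.chambercollege.com/special-offers.aspx/"
        print(urljoin(base, "images/shadow_right.jpg"))
        # -> http://www.chambercollege.com/special-offers.aspx/images/shadow_right.jpg

    Returning a plain 404 (or 410) for such junk paths, rather than redirecting them, is generally enough for bots to drop them over time.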

    Read the article

  • What measures can be taken to make sure Google is aware of the existence of a newly created page?

    - by knorv
    Consider a website with a large number of pages, where new pages are published regularly. When publishing a new page, the website operator wants the newly created page indexed by Google as soon as possible, minimizing the time between publication and indexing. Consider the site http://www.example.com/ with hundreds of thousands of pages. The page http://www.example.com/something/important-page.html is created at, say, 12:00. I want to get important-page.html indexed as soon as possible after 12:00, ideally within seconds or minutes. What options are available to try to get Google to index a specific newly created page as soon as possible?
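
    Besides linking the new page from already-indexed pages, one widely used tactic is to list it in an XML sitemap and ping Google's sitemap endpoint immediately after publishing. A minimal sketch, assuming the new URL has already been added to sitemap.xml (the domain follows the question's example):

        from urllib.parse import quote
        from urllib.request import urlopen

        # Tell Google the sitemap changed so the new URL is crawled sooner.
        sitemap = "http://www.example.com/sitemap.xml"
        urlopen("http://www.google.com/ping?sitemap=" + quote(sitemap, safe=""))

    This only invites a crawl; how quickly the page actually appears in results is up to Google.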

    Read the article

  • How to create a temporary staging server on my home machine [closed]

    - by Homunculus Reticulli
    Possible Duplicate: What things are required to host a website at home? I want to create a temporary staging server which can be accessed (i.e. via browser) by other people that I want to show the website to (a business partner who is halfway across the world). IIRC, my ISP issues dynamic addresses, so I may need to register with a dynamic DNS service (not sure about this). Although I'm a software developer, I don't know much about the hardware side of things, and would appreciate help in getting set up so I can show the website to my business partner. Here are the relevant details:
    Web server: Apache 2.2
    OS: Ubuntu 10.04 LTS
    Modem/router: ZyXEL P-600
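
    The usual recipe for a dynamic-IP home setup: forward port 80 on the router to the machine running Apache, then either share the current public IP directly or register a hostname with a dynamic DNS service so the address survives IP changes. A quick way to check the public IP from the machine itself (api.ipify.org is one public echo service; any equivalent works):

        from urllib.request import urlopen

        # The address the outside world sees; this is what the business
        # partner would browse to, assuming port 80 is forwarded.
        public_ip = urlopen("https://api.ipify.org").read().decode()
        print("http://" + public_ip + "/")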

    Read the article

  • How can I estimate the entropy of a password?

    - by Wug
    Having read various resources about password strength, I'm trying to create an algorithm that provides a rough estimation of how much entropy a password has, and that is as comprehensive as possible. At this point I only have pseudocode, but the algorithm covers the following:

    - password length
    - repeated characters
    - patterns (logical)
    - different character spaces (LC, UC, Numeric, Special, Extended)
    - dictionary attacks

    It does NOT cover the following, and SHOULD cover it WELL (though not perfectly):

    - ordering (passwords can be strictly ordered by output of this algorithm)
    - patterns (spatial)

    Can anyone provide some insight on what this algorithm might be weak to? Specifically, can anyone think of situations where feeding a password to the algorithm would OVERESTIMATE its strength? Underestimations are less of an issue. The algorithm:

        // the password to test
        password = ?
        length = length(password)

        // unique character counts from password (duplicates discarded)
        uqlca = number of unique lowercase alphabetic characters in password
        uquca = number of unique uppercase alphabetic characters
        uqd   = number of unique digits
        uqsp  = number of unique special characters (anything with a key on the keyboard)
        uqxc  = number of unique special special characters (alt codes, extended-ascii stuff)

        // algorithm parameters, total sizes of alphabet spaces
        Nlca = total possible number of lowercase letters (26)
        Nuca = total uppercase letters (26)
        Nd   = total digits (10)
        Nsp  = total special characters (32 or something)
        Nxc  = total extended ascii characters that don't fit into other categories (idk, 50?)

        // algorithm parameters, pw strength growth rates as percentages (per character)
        flca = entropy growth factor for lowercase letters (.25 is probably a good value)
        fuca = EGF for uppercase letters (.4 is probably good)
        fd   = EGF for digits (.4 is probably good)
        fsp  = EGF for special chars (.5 is probably good)
        fxc  = EGF for extended ascii chars (.75 is probably good)

        // repetition factors. few unique letters == low factor, many unique == high
        rflca = (1 - (1 - flca) ^ uqlca)
        rfuca = (1 - (1 - fuca) ^ uquca)
        rfd   = (1 - (1 - fd  ) ^ uqd  )
        rfsp  = (1 - (1 - fsp ) ^ uqsp )
        rfxc  = (1 - (1 - fxc ) ^ uqxc )

        // digit strengths
        strength = (rflca * Nlca + rfuca * Nuca + rfd * Nd + rfsp * Nsp + rfxc * Nxc) ^ length
        entropybits = log_base_2(strength)

    A few inputs and their desired and actual entropy_bits outputs:

        INPUT              DESIRED          ACTUAL
        aaa                very pathetic      8.1
        aaaaaaaaa          pathetic          24.7
        abcdefghi          weak              31.2
        H0ley$Mol3y_       strong            72.2
        s^fU¬5ü;y34G<      wtf               88.9
        [a^36]*            pathetic          97.2
        [a^20]A[a^15]*     strong           146.8
        xkcd1**            medium            79.3
        xkcd2**            wtf              160.5

        *  these 2 passwords use shortened notation, where [a^N] expands to N a's
        ** xkcd1 = "Tr0ub4dor&3", xkcd2 = "correct horse battery staple"

    The algorithm does realize (correctly) that increasing the alphabet size (even by one character) vastly strengthens long passwords, as shown by the difference in entropy_bits for the 6th and 7th passwords, which both consist of 36 a's, except that the second one's 21st character is capitalized. However, it does not account for the fact that a password of 36 a's is a bad idea: it's easily broken with a weak password cracker (and anyone who watches you type it will see it), and the algorithm doesn't reflect that. It does, however, reflect the fact that xkcd1 is a weak password compared to xkcd2, despite having greater complexity density (is this even a thing?). How can I improve this algorithm?
    Addendum 1

    Dictionary attacks and pattern-based attacks seem to be the big thing, so I'll take a stab at addressing those. I could perform a comprehensive search through the password for words from a word list and replace each word with a token unique to the word it represents. Word-tokens would then be treated as characters and have their own weight system, adding their own weights to the password. I'd need a few new algorithm parameters (I'll call them lw, Nw ~= 2^11, fw ~= .5, and rfw) and I'd factor the weight into the password as I would any of the other weights.

    This word search could be specially modified to match both lowercase and uppercase letters as well as common character substitutions, like that of E with 3. If I didn't add extra weight to such matched words, the algorithm would underestimate their strength by a bit or two per word, which is OK. Otherwise, a general rule would be: for each non-perfect character match, give the word a bonus bit.

    I could then perform simple pattern checks, such as searches for runs of repeated characters and derivative tests (taking the difference between adjacent characters), which would identify patterns such as 'aaaaa' and '12345', and replace each detected pattern with a pattern token, unique to the pattern and length. The algorithmic parameters (specifically, entropy per pattern) could be generated on the fly based on the pattern.

    At this point, I'd take the length of the password. Each word token and pattern token would count as one character; each token would replace the characters it symbolically represents. I made up some sort of pattern notation, which includes the pattern length l, the pattern order o, and the base element b. This information could be used to compute some arbitrary weight for each pattern. I'd do something better in actual code.

    Modified example:

        Password:          1234kitty$$$$$herpderp
        Tokenized:         1 2 3 4 k i t t y $ $ $ $ $ h e r p d e r p
        Words Filtered:    1 2 3 4 @W5783 $ $ $ $ $ @W9001 @W9002
        Patterns Filtered: @P[l=4,o=1,b='1'] @W5783 @P[l=5,o=0,b='$'] @W9001 @W9002

        Breakdown: 3 small, unique words and 2 patterns
        Entropy:   about 45 bits, as per the modified algorithm

        Password:       correcthorsebatterystaple
        Tokenized:      c o r r e c t h o r s e b a t t e r y s t a p l e
        Words Filtered: @W6783 @W7923 @W1535 @W2285

        Breakdown: 4 small, unique words and no patterns
        Entropy:   43 bits, as per the modified algorithm

    The exact semantics of how entropy is calculated from patterns is up for discussion. I was thinking something like:

        entropy(b) * l * (o + 1)    // o will be either zero or one

    The modified algorithm would find flaws with, and reduce the strength of, each password in the original table, with the exception of s^fU¬5ü;y34G<, which contains no words or patterns.
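
    For readers who want to experiment, here is a minimal runnable sketch of the base pseudocode above in Python. The alphabet sizes and EGF values are the question's own guesses; the outputs will not exactly match the ACTUAL column, which came from the author's real implementation with its own tuned parameters:

        import math
        import string

        # (membership test, alphabet size N, entropy growth factor f),
        # one row per character space from the pseudocode.
        SPACES = [
            (lambda c: c in string.ascii_lowercase, 26, 0.25),           # lowercase
            (lambda c: c in string.ascii_uppercase, 26, 0.40),           # uppercase
            (lambda c: c in string.digits,          10, 0.40),           # digits
            (lambda c: c in string.punctuation or c == " ", 32, 0.50),   # specials
            (lambda c: ord(c) > 127,                50, 0.75),           # extended
        ]

        def entropy_bits(password):
            base = 0.0
            for member, n, f in SPACES:
                uq = len({c for c in password if member(c)})  # unique chars in space
                base += (1 - (1 - f) ** uq) * n               # repetition factor * N
            # strength = base ^ length; entropy = log2(strength)
            return len(password) * math.log2(base) if base > 0 else 0.0

        for pw in ["aaa", "a" * 36, "a" * 20 + "A" + "a" * 15,
                   "Tr0ub4dor&3", "correct horse battery staple"]:
            print("%-32r %6.1f bits" % (pw, entropy_bits(pw)))

    With these parameters, "aaa" comes out at 8.1 bits and the two 36-character examples at 97.2 and 146.8, matching the table; the multi-class examples drift from the table's values, which shows how sensitive the estimate is to the chosen EGFs.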

    Read the article

  • IIS isn't propagating domain

    - by ErocM
    I called GoDaddy and 'verified' that my settings for the two IPs were correct: ns1.asezo.com = xx.xx.xx.15 and ns2.asezo.com = xx.xx.xx.16. Then I set the nameservers of asezo.com to the ns1/ns2 above, which GoDaddy tech support says is right. When I try to visit my site, I get "Oops! Google Chrome could not find asezo.com." When I try to ping the website, it times out. I have the bindings in IIS for the default website as: http - xx.xx.xx.15 - 80 - www.asezo.com and http - xx.xx.xx.15 - 80 - asezo.com. And I'm still getting nothing. I can go directly to the IP (http://xx.xx.xx.15/) and it gives me the IIS default website; I just can't get the URL to resolve. What am I doing wrong?
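
    Since both the browser error and the ping timeout point at name resolution rather than at IIS, a quick first check is to resolve the names directly and see whether DNS answers at all; a minimal sketch:

        import socket

        # If DNS is not answering for the domain, resolution fails before the
        # request ever reaches IIS, so the bindings are not the problem yet.
        for host in ("asezo.com", "www.asezo.com"):
            try:
                print(host, "->", socket.gethostbyname(host))
            except socket.gaierror as exc:
                print(host, "-> lookup failed:", exc)

    If the lookups fail, the likely issue is that the nameservers registered at GoDaddy are not actually serving DNS records for the zone; pointing ns1/ns2 at the web server's IPs only works if a DNS server is running there.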

    Read the article

  • looking for advice regarding a free shopping cart solution [on hold]

    - by thirdCharm
    I am building a very small e-commerce site, and I need a free and simple-to-use/deploy/integrate shopping cart that I can add to my website in order to sell a FEW items. I want the shopping cart to be an add-on to my website, nothing fancy. Ideally, when a person clicks the "Add to cart" button, they will be redirected to the shopping cart, which will then handle different payment methods and everything else you would expect from a fully working shopping cart. I am currently developing my website using the following tools/frameworks: SQL Server 2008 R2; Visual Studio 2010 (ASP.NET 4.5 - C#); HTML5, CSS3, and JS. I am also interested in using PayPal alongside my shopping cart. Help of any kind is appreciated!

    Read the article

  • How to refuse to give an access to passwords to a customer without being unprofessional or rude?

    - by MainMa
    Let's say you're creating a website for a customer. This website has its own registration (whether combined with OpenID or not). The customer asks to be able to see the passwords the users are choosing, given that users will probably use the same password on every website. In general, I reply either that it is impossible to retrieve the passwords, since they are not stored in plain text but hashed; or that I have no right to do that; or that administrators must not be able to see the passwords of users, without giving any additional details. The first answer is, strictly speaking, false: even if the passwords are hashed, it is still possible to catch and store them on each logon (for example, via a strange sort of audit that remembers not only which user succeeded or failed to log on, but also with which password). The second is rude. How can I refuse this request without being either unprofessional or rude?
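
    For reference, the "hashed, not recoverable" argument rests on one-way password storage; a minimal sketch of the standard pattern using Python's standard library (the iteration count here is illustrative):

        import hashlib, hmac, os

        def hash_password(password, salt=None):
            # A fresh random salt per user; only (salt, digest) is stored.
            salt = salt or os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
            return salt, digest

        def verify(password, salt, digest):
            # Recompute and compare in constant time; the plaintext is never kept.
            candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
            return hmac.compare_digest(candidate, digest)

        salt, stored = hash_password("hunter2")
        print(verify("hunter2", salt, stored))  # True
        print(verify("wrong", salt, stored))    # False

    As the question itself notes, nothing stops a malicious operator from logging the plaintext at the moment of login, which is exactly why the honest answer is about policy and trust, not just about hashing.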

    Read the article

  • Why does Chrome crash on websites using Font Awesome?

    - by harry
    Since GitHub started using a web font for its icons, every time I open the GitHub website, that tab crashes; other websites don't crash. I tried everything I could think of, like disabling all extensions, clearing all caches, and even creating a new user profile, and it's still not working. I'm using Ubuntu Linux 12.04 with the latest stable Google Chrome, always up to date. I searched Google and found no one with the same issue. And I found that any website using Font Awesome will crash its Chrome tab when I open it.

    Read the article

  • How to recover my inclusion in Google results after being penalized for comment spam?

    - by UXdesigner
    My website ranked very high in search engines, especially Google. But I left the website unattended for a couple of months and didn't notice the comments filling up with spam, about 20k spam comments. Then I checked my Google results, and I'm out of Google! After years of good results and no spam, how can I recover from that? The spam problem has been solved completely: no more spam, and the website is very legit and very nice. Well, at least I think I was penalized; I don't see any other reason.

    Read the article

  • Google Analytics - visitor path to specific site destination setup and monitoring?

    - by Joshc
    I have a website on which I am using Google Analytics to track visitors and our banner campaigns. We are promoting 'Purchase Ticket' buttons on our website which send visitors to a third-party website that sells and distributes our tickets. The URL on all the 'Purchase Ticket' buttons is the same throughout the site. Example: http://ticketmaestro.com/events/my-event-2012. In the Analytics control panel, is it possible to set something up where I create a path-to-destination using the above example URL? And after this is set up, I want to be able to monitor the path visitors take from when they reach the site to when they click the 'Purchase Ticket' button. Graphs would show: start destination, path to final destination, and final destination (http://ticketmaestro.com/events/my-event-2012). Any help, suggestions, or terminology would be great. Thanks, Josh

    Read the article

  • How do I require users to sign up before they can access the web site?

    - by user1867842
    Two related questions. First, how do I stop the browser from showing a members-only page when a logged-out user hits the back button? Second, how can I block pages the way Facebook does: it doesn't let you into the site without an account, and if you put a URL to something on their site into the address bar, it gives you a page that says "you have to be logged in first". I don't want someone going to the URL of the index page before they have signed up as a member; they need to make an account first, and then they can have access to the index page. My website so far has a database and five pages, two of which are the login and sign-up pages; both use PHP and MySQL and work fine. How do I restrict access to the main website so users must first sign up for an account?
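
    The standard pattern is a server-side session check on every protected page, plus no-cache headers so the back button cannot replay members-only pages after logout. In PHP this is a $_SESSION check at the top of each protected script; here is a compact sketch of the same idea, written in Python/Flask purely for illustration (route names are hypothetical):

        from flask import Flask, redirect, request, session, url_for

        app = Flask(__name__)
        app.secret_key = "change-me"  # required for signed session cookies

        PUBLIC = {"login", "signup", "static"}  # endpoints open to everyone

        @app.before_request
        def require_login():
            # Anything not public requires a logged-in session.
            if request.endpoint not in PUBLIC and not session.get("user_id"):
                return redirect(url_for("login"))

        @app.after_request
        def no_cache(resp):
            # Stop the back button from showing cached members-only pages
            # after logout.
            resp.headers["Cache-Control"] = "no-store, must-revalidate"
            return resp

        @app.route("/login")
        def login():
            return "login / sign-up page"

        @app.route("/")
        def index():
            return "members-only index"

    In PHP the equivalent of the before_request hook is a session_start() plus a check of $_SESSION['user_id'] (redirecting to login.php if it is empty) at the top of each protected page, along with the same Cache-Control header.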

    Read the article

  • Fresh start outside Google's crapbox [on hold]

    - by Krzysztof Minister Bytu
    I might have been experimenting with my website too much: Google first cut the flow of visitors considerably, and now I haven't had a single one for four days. It's a joke that they've done this, because I've put a lot of work into it, but that's a topic for another day. My question is about avoiding it in the future. I want to take the partly improved design from that website onto a new one and get a new domain name. The question is: in that case, do I have to change the hosting account as well (its address contains my old website name), or is changing the domain enough for Google to treat it as something new from a "fresh user"? In other words, does Google look past the domain name at the actual hosting address? I'd hate to waste another few months of hard work, so I prefer to take every possible precaution, but not paying for another hosting plan would make things easier on the wallet.

    Read the article

  • Tips to Keep in Mind While Choosing an SEO Company

    If you own a website and want to make a serious profit with its help, you have to find ethical means by which the website can be promoted in the market, so that more visitors come to your site. For this, you can take advice from people who are already in this business and share ideas with them. Nowadays the most common technique used in the market is SEO, also known as Search Engine Optimization. Many people in the web world are familiar with this term but don't have a full picture of it. For such people, there are many Search Engine Optimization companies offering assistance.

    Read the article

  • Which one of the Google Analytics stats should be my top priority to improve SEO: visits, pageviews, bounce rate, new visitors, or visit duration

    - by HOY
    My site is getting a lot of traffic from an image (a company logo image) because this image ranks 1st in Google search results for the company's name. (I have no idea how that happened.) This image is a must for my website, but it is not relevant to the site content, so people searching for the image find my site by accident. The statistics are interesting. Pros: total visits and average new visits. Cons: average pages/visit, average visit duration, and bounce rate. I am confused about whether this image is helpful to my website; I don't know what the balance between those five statistics should be. My website is 2 months old, and we are working on SEO at the moment. Edit: here are the details about the image. Search keyword: arcelik logo. Search site: google.com.tr. Search URL

    Read the article

  • Help with a CMS for content only, not display

    - by user2091756
    Hello, I'm trying to build a tool for a school website. Students take a test and, according to the results (27 possibilities), they get a set of activities (questions) matched to their level, which they can work through over around 3 months, logging in to the website periodically; teachers also need to log in and look at reports. Now, I'm a graphic designer, so my skills are mostly HTML5 and CSS3; I know some PHP (editing existing code only) and JavaScript (jQuery) as well. Most people tell me I need a CMS to build the tool, but all I find are CMSs for display, like blogs or news websites, which I don't think are useful for me because the website is already made in HTML and CSS3 only (I need to add an extra page for the tool). I understand that I need to create users and give them rights according to their user type, and that I need a database where I can store all my questions. What is the best way to do this? What do you suggest? Thanks

    Read the article

  • Removing "www." from domain name and SEO

    - by TecMan
    We are doing a big redesign of our website, and at least 50% of the website folders will be moved to new places with different names (i.e. many URLs will change). Google will surely need some time to index the new pages, and we expect our SERP positions to be worse than they are now for a while. We also have an old idea of removing www from our domain name. It seems this is the right time to do these two things together, when publishing the website with updated contents. Or is it better, from an SEO perspective, to publish the new contents first, and only after some time, when our SERP positions return to their prior level, tell Google that the domain name without www is our preferred domain?

    Read the article

  • Are there advantages of using hard coded URLs for localization?

    - by nbolton
    On the Synergy website, localization is detected (and can be overridden), but the same URL is used for all languages. Some websites, however, like Wikipedia, have language-specific subdomains. What are the advantages of having either subdomains or subdirectories (i.e. a specific URL) for each language localization? Also, should the site automatically redirect the user to the specific subdomain/subdirectory based on the language the browser requests? I suspect there are advantages, which I'm guessing are: when the website appears in search results for non-English languages, the translated page description will be shown (assuming the website provides a translation); and when a user shares a page (e.g. through Twitter), it will show in a specific language (perhaps this is a disadvantage, though?). Am I correct, and if so, are there more advantages?

    Read the article

  • What is the most secure environment for multiple CMS sites? [closed]

    - by Brian Gulino
    I wish to run about 50 low-traffic Joomla or WordPress websites on one server, or part of a server. Each website will be managed by its own naive owner, who will be able to access the Joomla or WordPress backend of the website. I am concerned about security and isolation, as my users will periodically get into trouble by not protecting their sites properly. I know of two alternatives. One: run one Linux system with multiple websites under Apache, follow current Joomla and WordPress security tips, and increase the isolation of the individual sites by using mpm-itk, which allows each website to run as its own user. Two: run virtualization software such as the Xen hypervisor, giving each site its own virtual Linux system. I lack the experience needed to make this decision and am asking which path to take. Obviously, there may be other alternatives that I haven't considered.

    Read the article

  • How to manually list a set of URLs for search engines to index

    - by MarutiB
    So I have created a video website which has thousands of videos, with thousands more added daily. Here is my problem: the site loads a skeleton in HTML and pulls all the content in through JavaScript and Ajax. As a result, search engines aren't going anywhere except the home page. Is there a way, say in robots.txt, to point to a single HTML page that links to all these videos? I accept that my site is not accessible to non-JavaScript users, but stats show that this ratio is very low (0.2%). Is there a way I can keep the completely AJAX website and still get each individual video listed on Google?
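
    The usual mechanism for this is an XML sitemap rather than an HTML link page; robots.txt can then point crawlers at it with a single Sitemap: line. A minimal sketch that generates one from the video URLs (get_video_urls() is a hypothetical stand-in for a database query, and the domain is illustrative):

        from xml.sax.saxutils import escape

        def get_video_urls():
            # Hypothetical: in practice, query the site's video database.
            return ["http://example.com/video/1", "http://example.com/video/2"]

        entries = "\n".join(
            "  <url><loc>%s</loc></url>" % escape(u) for u in get_video_urls()
        )
        with open("sitemap.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            f.write(entries + "\n</urlset>\n")

        # robots.txt then gains one line:
        # Sitemap: http://example.com/sitemap.xml

    The sitemap gets the URLs crawled, but since the pages themselves are built by JavaScript, whatever content must be indexed still has to appear in the HTML response served for those URLs.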

    Read the article

  • Will anyone solve my Google ranking confusion? [on hold]

    - by India SEO Analyst
    I am handling the website www.usamovingandstorage.com, targeting the keyword "chicago movers", but my website is on the third page. My website has good backlinks, and recently I removed irrelevant backlinks as well. I compared my competitors' websites, such as www.ampolmoving.com and www.chicagomovers.com: they have no such big backlinks, but they rank on the first page in Google. I compared the three websites in www.opensiteexplorer.org, and my site has good results there. So how did this happen? I need a full comparison: why is my site ranking on the third page, and what actions do I need to take to rank on the first page?

    Read the article
