Search Results

Search found 18715 results on 749 pages for 'website attack'.

Page 120 of 749

  • What exactly is an SEO friendly site?

    - by Tom
    Hey, so I've seen web developers write in their CVs that they create "SEO friendly sites". I've also heard that WordPress and other CMSs are SEO friendly. So what does "SEO friendly site" actually mean? I understand that titles and URLs are probably the most important things for ranking well in Google, but are there other things I should know? Thanks
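
    As an illustration (not part of the original question), one concrete thing "SEO friendly" usually means in practice is readable, keyword-bearing URLs generated from page titles rather than opaque query strings like ?p=123. A minimal TypeScript sketch of such a slug generator:

        // Turn a page title into a URL slug, e.g. for /articles/<slug>.
        function slugify(title: string): string {
          return title
            .toLowerCase()
            .normalize("NFKD")                // split accented characters apart
            .replace(/[\u0300-\u036f]/g, "")  // drop the accent marks
            .replace(/[^a-z0-9]+/g, "-")      // runs of non-alphanumerics become one hyphen
            .replace(/(^-|-$)/g, "");         // trim leading/trailing hyphens
        }

        // "What Exactly Is an SEO Friendly Site?" -> "what-exactly-is-an-seo-friendly-site"
        console.log(slugify("What Exactly Is an SEO Friendly Site?"));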

    Read the article

  • css to replace characters in paragraph tag

    - by Thariama
    I already checked the existing questions for this but didn't find an exact match. My aim is to replace characters (like spaces) on a webpage with a small image using CSS. Example: <p><span>This is a text</span></p> becomes <p><span>ThisIMGisIMGaIMGtext</span></p> (where IMG stands for a visible image, e.g. a middot picture for a space). I cannot think of a suitable CSS selector, but maybe one of you guys (or girls) knows a solution. Is this possible at all?
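
    For context (this is not from the original question): CSS selectors cannot target individual characters inside a text node, so pure CSS cannot do this substitution; a small script is one common workaround. A minimal TypeScript sketch, where the selector and image path are assumptions:

        // Replace each space inside an element with a small <img>, keeping the words as text.
        function replaceSpacesWithImages(el: HTMLElement, imgSrc: string): void {
          const words = (el.textContent ?? "").split(" ");
          el.textContent = "";                          // clear the original text
          words.forEach((word, i) => {
            el.appendChild(document.createTextNode(word));
            if (i < words.length - 1) {
              const img = document.createElement("img");
              img.src = imgSrc;                         // e.g. a middot-style spacer image
              img.alt = " ";                            // keeps copied text readable
              el.appendChild(img);
            }
          });
        }

        // Apply it to every <p><span>...</span></p> on the page.
        document.querySelectorAll<HTMLElement>("p > span")
          .forEach(span => replaceSpacesWithImages(span, "middot.png"));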

    Read the article

  • Why is JavaScript needed when developing a page?

    - by nepc
    I have been developing websites for some time now and I hardly use any JavaScript in my pages. Whatever I want to do with JavaScript seems possible through PHP. Take AJAX itself: we can send a regular request instead of an AJAX request, can't we? We can use include to pull in sub-parts of pages. So am I missing something about JavaScript that I don't know of?
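
    To make the trade-off concrete (an illustration, not part of the original question): server-side PHP can only produce a new page per request, whereas client-side JavaScript can update part of an already-rendered page without a full reload. A minimal TypeScript sketch, with the endpoint and element id assumed:

        // Refresh just the comments section without reloading the rest of the page.
        async function refreshComments(): Promise<void> {
          const response = await fetch("/api/comments"); // AJAX-style background request
          const html = await response.text();
          const target = document.getElementById("comments");
          if (target) target.innerHTML = html;           // swap in only this fragment
        }

        // Poll every 30 seconds; a PHP include could only do this with a full page reload.
        setInterval(refreshComments, 30_000);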

    Read the article

  • Main page content populated on the fly?

    - by jcovert
    Is there any reason NOT to have a webpage retrieve its main content on the fly? For example, I have a page that has a header and a footer, and in the middle of this page is an empty div. When you click one of the buttons in the header, an HTTP GET is done behind the scenes and the innerHTML of the empty div is replaced with the result. I can't think of any reason why this might be a bad idea, but I can't seem to find any pages out there that do it. Please advise!
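
    A minimal sketch of the pattern being described (element ids, URLs and attributes here are assumptions, not from the post); the pushState call addresses one common objection, namely that purely script-loaded content otherwise leaves the URL unbookmarkable and harder for crawlers to reach:

        // Fetch a fragment and swap it into the empty content <div>.
        async function loadSection(url: string): Promise<void> {
          const res = await fetch(url);                       // behind-the-scenes GET
          const fragment = await res.text();
          const content = document.getElementById("content"); // the empty div
          if (content) content.innerHTML = fragment;
          history.pushState({ url }, "", url);                // keep the URL shareable
        }

        // Wire up the header buttons.
        document.querySelectorAll<HTMLButtonElement>("header button[data-url]")
          .forEach(btn => btn.addEventListener("click", () => loadSection(btn.dataset.url!)));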

    Read the article

  • Implementing an Online Waiting Room

    - by saalon
    My organization is building a new version of our ticketing site and is looking for the best way to build an online waiting room when the number of users in our purchase path exceeds a certain limit. The best version of this queue would let new users in after existing users have either completed their purchase or have exceeded a timeout limit after entering the path. I'm trying to get an idea of how this has been implemented by other organizations. Has anyone out there done something similar or have any experience with this? We have some ideas, but I'd like to get a sense of what solutions have been tried and what problems those solutions have run up against. Just to be complete, this site is being built in Ruby on Rails, though I'd love to hear about how people have solved this regardless of platform.
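
    For illustration only (this is not the poster's Rails code), here is a minimal in-memory sketch of the admission rule described: at most a fixed number of users in the purchase path, with slots freed when a purchase completes or a user times out. A real deployment would keep this state somewhere shared, such as Redis or the database, rather than in process memory:

        interface Slot { userId: string; enteredAt: number; }

        class WaitingRoom {
          private active = new Map<string, Slot>();
          private queue: string[] = [];

          constructor(private capacity: number, private timeoutMs: number) {}

          // A user asks to enter the purchase path.
          request(userId: string): "admitted" | "queued" {
            this.evictExpired();
            if (this.active.size < this.capacity) {
              this.active.set(userId, { userId, enteredAt: Date.now() });
              return "admitted";
            }
            this.queue.push(userId);
            return "queued";
          }

          // A purchase completed (or the session was abandoned).
          release(userId: string): void {
            this.active.delete(userId);
            this.admitFromQueue();
          }

          private evictExpired(): void {
            const now = Date.now();
            for (const [id, slot] of this.active)
              if (now - slot.enteredAt > this.timeoutMs) this.active.delete(id);
            this.admitFromQueue();
          }

          private admitFromQueue(): void {
            while (this.active.size < this.capacity && this.queue.length > 0) {
              const next = this.queue.shift()!;
              this.active.set(next, { userId: next, enteredAt: Date.now() });
            }
          }
        }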

    Read the article

  • GetUserDetails error 27: The name 'IMGUserLabel' does not exist in the current context

    - by FBEvo1
    I need to get the user's name displayed next to the user's picture. The picture works great, however IMGUserLabel says it is not in context. Can you help me solve this?

        public void GetUserDetails(int id)
        {
            string getUserDetail = "Select ID,Email,Name,Country,Convert(varchar (20), RegisterDate, 106) RegisterDate,Convert(varchar (20), LastLogin, 106) LastLogin ,Description,ImageName FROM [User] where Id='" + id + "'";
            dt = dbClass.ConnectDataBaseReturnDT(getUserDetail);
            if (dt.Rows.Count > 0)
            {
                IMGUserLabel.Text = dt.Rows[0]["Name"].ToString();
                NameLabel.Text = dt.Rows[0]["Name"].ToString();
                UserImage.ImageUrl = "~/UserImage/" + dt.Rows[0]["ImageName"].ToString();
                lblCreated.Text = dt.Rows[0]["RegisterDate"].ToString();
                LabelLastLogin.Text = dt.Rows[0]["LastLogin"].ToString();
                lblCreated.Text = dt.Rows[0]["RegisterDate"].ToString();
                LabelAboutMe.Text = dt.Rows[0]["Description"].ToString();
            }
        }

    The .aspx markup:

        <a href="<%#GetUserDetails(GetUser(Int Id)%>">
            <asp:Label ID="IMGUserLabel" runat="server" Text="Label" Font-Names="Segoe UI"
                Font-Size="Larger" ForeColor="White" src="<%#GetUserDetails(GetUser(Int Id)%>">
            </asp:Label>
        </a>

    The error again: The name 'IMGUserLabel' does not exist in the current context.

    Read the article

  • Do parent website application pools serve child application pools as well?

    - by Mike G
    I am running a .NET web application in its own application pool on IIS7. The parent website is also set to run in its own application pool. Today we noticed a huge number of connections going to IIS. I tried to browse a plain old .html page in the directory of the web application and it hangs. I then tried to browse another plain .html file in the root of the parent website, and it too hangs. In Performance Monitor, I see there are some 8k connections to the default website and climbing. I can't tell whether my application is the problem or IIS itself. If it were my application, wouldn't the HTML page in the root of the parent website still be served? Edit: Also, if I shut down the app pool for my application, the HTML page in the root of the parent website still cannot be displayed.

    Read the article

  • How did my email end up in spam? Only this specific email is filtered; other emails get through

    - by mugetsu
    My website has users buy our products, and when the purchase completes it sends the user an email. However, this email always ends up in spam! When the user first registers, the site also sends an email; that email is not filtered and goes into the normal inbox. I'm not quite sure why this is so. Gmail vaguely tells me that "It's similar to messages that were detected by our spam filters." So I'm thinking I need to reword the following email. Can I get some tips? Or could something else be causing this? Thanks! Here is the unformatted email:

        Delivered-To: [email protected]
        Received: by 10.112.32.98 with SMTP id h2csp61953lbi; Tue, 20 Mar 2012 21:09:13 -0700 (PDT)
        Received: by 10.180.79.72 with SMTP id h8mr22836827wix.1.1332302953175; Tue, 20 Mar 2012 21:09:13 -0700 (PDT)
        Return-Path: <[email protected]>
        Received: from mail26.elasticemail.org (mail26.elasticemail.org. [178.32.180.26]) by mx.google.com with SMTP id 6si518487wiz.41.2012.03.20.21.09.12; Tue, 20 Mar 2012 21:09:12 -0700 (PDT)
        Received-SPF: pass (google.com: domain of [email protected] designates 178.32.180.26 as permitted sender) client-ip=178.32.180.26;
        Authentication-Results: mx.google.com; spf=pass (google.com: domain of [email protected] designates 178.32.180.26 as permitted sender) [email protected]; dkim=pass [email protected]
        DKIM-Signature: v=1; a=rsa-sha1; bh=qjc8jxQuGy9pLN1YV9TR2PHQYKg=; c=relaxed/relaxed; d=website.com; s=api; h=DomainKey-Signature:MIME-Version:From:To:List-Unsubscribe:Subject:Date:Reply-To:Message-ID:Content-Type; b=Odt+nYhjntXPl7JPVHeJWjkStemt6so+FPVYY6oMKziMFzmW8YiLhN8WwSLY0faMcn/rirKsO2dOm/kvcHlqUJC7ldhaydE6bPekkBDa9kBovlGwPNm6xy9QWPP9I1fXDLDCwqqeAXv8kN0daXbh3pVyqWNUOk5cgQ35OgpQpKI=
        DomainKey-Signature: q=dns; a=rsa-sha1; c=simple; d=website.com; s=api; h=MIME-Version:X-Mailer:From:To:X-Priority:List-Unsubscribe:Subject:Date:Reply-To:Message-ID:Content-Type; b=F7NNZIEyEV+64uYD8pVpe91WLP19Tw2Whk4OKpkLeAfkmrNIA7AjP0XYU1JWTlEyibHQJjjbhR62I3MvVJBSGp75eWfOuwb2AqYWZ/jAlMWznnfQLVv7OlYJsErGxYP6GUNNcuJaqlTPFDanJwtaEvR+tqXZRB7xrUisMd8lq2I=
        MIME-Version: 1.0
        X-Mailer: email.website.com
        From: "Website Contact" <[email protected]>
        To: [email protected]
        X-Priority: 3 (Normal)
        List-Unsubscribe: <http://email.website.com/tracking/unsubscribe?msgid=su6g-8kfd0s0g>, <mailto:[email protected]?subject=unsubscribe>
        Subject: Website Tickets: event
        Date: Wed, 21 Mar 2012 04:09:17 +0000
        Reply-To: "Website Contact" <[email protected]>
        Message-ID: <4tlaxecj2jy8.su6g-8kfd0s0g@email.website.com>
        Content-Type: multipart/alternative; boundary="----=_NextPart_000_3F77_7A0DF805.A8C886C0"

        ------=_NextPart_000_3F77_7A0DF805.A8C886C0
        Content-Type: text/plain; charset="utf-8"
        Content-Transfer-Encoding: base64

        SGVsbG8hIAoKIEhlcmUgYXJlIHlvdXIgdGlja2V0KHMpIGZvciBDVEFTIGVDc1RBU3kgMjAxMjog
        CgpodHRwczovL2NhbXB1c2FtcC5jb20vP3RpY2tldHMvNy95aGloZ3Znd3Z3cWR3cXhtdnQKClNp
        bXBseSBicmluZyBpdCB3aXRoIHlvdSBvbiB5b3VyIHNtYXJ0cGhvbmUsIG9yIHByaW50IHRoZSB0
        aWNrZXQgb3V0IHRvIGJlIHNjYW5uZWQgYXQgdGhlIGV2ZW50LiBFbmpveSwgYW5kIHdlIGFwcHJl
        Y2lhdGUgeW91ciBwdXJjaGFzZS4KClNpbmNlcmVseSwKVGhlIENhbXB1c0FtcCBUZWFt

        ------=_NextPart_000_3F77_7A0DF805.A8C886C0
        Content-Type: text/html; charset="utf-8"
        Content-Transfer-Encoding: base64

        SGVsbG8hIDxici8+PGJyLz4gSGVyZSBhcmUgeW91ciB0aWNrZXQocykgZm9yIENUQVMgZUNzVEFT
        eSAyMDEyOjxici8+PGEgaHJlZj0iaHR0cDovL2VtYWlsLmNhbXB1c2FtcC5jb20vdHJhY2tpbmcv
        Y2xpY2s/bXNnaWQ9c3U2Zy04a2ZkMHMwZyZ0YXJnZXQ9aHR0cHMlM2ElMmYlMmZjYW1wdXNhbXAu
        Y29tJTJmJTNmdGlja2V0cyUyZjclMmZ5aGloZ3Znd3Z3cWR3cXhtdnQiPiBodHRwczovL2NhbXB1
        c2FtcC5jb20vP3RpY2tldHMvNy95aGloZ3Znd3Z3cWR3cXhtdnQgIDwvYT4gPGJyLz48YnIvPlNp
        bXBseSBicmluZyBpdCB3aXRoIHlvdSBvbiB5b3VyIHNtYXJ0cGhvbmUsIG9yIHByaW50IHRoZSB0
        aWNrZXQgb3V0IHRvIGJlIHNjYW5uZWQgYXQgdGhlIGV2ZW50LiBFbmpveSwgYW5kIHdlIGFwcHJl
        Y2lhdGUgeW91ciBwdXJjaGFzZS48YnIvPjxici8+U2luY2VyZWx5LDxici8+VGhlIENhbXB1c0Ft
        cCBUZWFtPGltZyBzcmM9Imh0dHA6Ly9lbWFpbC5jYW1wdXNhbXAuY29tL3RyYWNraW5nL29wZW4/
        bXNnaWQ9c3U2Zy04a2ZkMHMwZyIgc3R5bGU9IndpZHRoOjFweDtoZWlnaHQ6MXB4IiBhbHQ9IiIg
        Lz4=

        ------=_NextPart_000_3F77_7A0DF805.A8C886C0--

    Read the article

  • Using SEO to hide a website from a specific location?

    - by mickburkejnr
    Hi everyone, a friend of mine wants to build a website, but he doesn't want people in the West Midlands area of the UK to be able to see it, while people outside the West Midlands should still be able to. Is this possible? I know SEO can be used to target specific countries to improve search results, but could it be used to target specific areas inside a country, and basically remove the website from Google listings for a specific area? Cheers!

    Read the article

  • Why Use Different Types of Media For Your Website?

    Here we will look at the different types of media available and the impact they can have, both on the traffic to your website and on the SEO benefits each can bring. First, your articles. Articles can be a great way of "filling" a website up.

    Read the article

  • Using Google Webmaster & Analytics, what data to look at to improve website performance?

    - by Rob
    Using data from Google Analytics and Webmaster Tools, what data should I be looking at to improve my website's performance? I want to improve the SEO, usability and general performance of my website. EDIT: It's a portfolio website. We've done the initial SEO, optimised all images and made the site as fast as possible. What kind of things should I be looking out for in the Analytics and Webmaster data to improve performance, both for SEO and for each individual page?

    Read the article

  • No description for any page on the website is available in Google despite robots.txt allowing crawling

    - by Abhijit
    I seem to have the weirdest issue with search engine optimization. I've asked the IT folks at my university, I've asked people on the Joomla forums, and I've been trying to sort it out with Google Webmaster Tools for more than two months, to little avail. I want to know if I have some blatantly wrong configuration somewhere that is preventing search engines from indexing this site. I noticed a similar issue with another website I found online (ECEGSA at The University of British Columbia, gsa.ece.ubc.ca), which makes me believe this might be something other people are looking for an answer to as well. Here are the details. The website in question is http://gsa.ece.umd.edu/. It runs on Joomla 2.5.x (latest). The site has been up since around mid December 2013, and I noticed from the get-go that it was not being indexed correctly on Google. Specifically, I see the following message when I search for the website on Google: "A description for this result is not available because of this site's robots.txt". The thing is, from December until around March I used the default Joomla robots.txt file, which is:

        User-agent: *
        Disallow: /administrator/
        Disallow: /cache/
        Disallow: /cli/
        Disallow: /components/
        Disallow: /images/
        Disallow: /includes/
        Disallow: /installation/
        Disallow: /language/
        Disallow: /libraries/
        Disallow: /logs/
        Disallow: /media/
        Disallow: /modules/
        Disallow: /plugins/
        Disallow: /templates/
        Disallow: /tmp/

    Nothing there should stop Google from crawling my website. Even more confusingly, when I go to Google Webmaster Tools, under the "Blocked URLs" tab, many of the links on the site that I test are shown as "Allowed". I then tried adding a sitemap and referencing it in the robots.txt file. That did not help: same exact search result, same behavior in the "Blocked URLs" tab. Additionally, the "Sitemaps" tab shows an error for several links saying "URL is robotted out", yet when I test those exact links under "Blocked URLs" they are allowed! I then tried deleting the robots.txt file. No use, same exact problem. (The post includes an example screenshot from Google Webmaster Tools, not reproduced here.) At this point I cannot give a rational explanation for why this is happening, and neither can anyone in the IT department here. No one on the Joomla forums seems to understand what is going on either. Based on what I've explained, does it seem that I have somehow configured something incorrectly in robots.txt, in .htaccess, or somewhere else?

    Read the article

  • Optimize SEO: 2 websites or 1 main website and subdomain? [duplicate]

    - by waanders
    This question already has an answer here: "Subdomain versus subdirectory" (4 answers). I'm working on the WordPress website of a small company, let's say www.xxx.com. Now they want a separate website for their workshops. Which structure is best from an SEO point of view? 1) www.xxx.com + www.xxx-workshops.com 2) www.xxx.com + www.xxx.com/workshops 3) www.xxx.com + workshops.xxx.com

    Read the article

  • Should users be deleted after inactivity on a website?

    - by Hovaness Bartamian
    When you run a social website, or any website where people can register, would you eventually delete accounts after a certain period of inactivity (say a year), or would you rather keep the account records forever? I know websites like Facebook have large numbers of inactive, duplicate and fake accounts. So I'm wondering whether, after two years of inactivity, it would be alright to send the account holder a warning email saying the account will be deleted unless they log in. I'm just thinking about clean and efficient database management, and about any implications this may have for potential new users.
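
    To make the suggested policy concrete (an illustration with assumed thresholds, not from the post): warn an account after a long period of inactivity, then delete it only if the warning goes unanswered. A minimal TypeScript sketch:

        interface Account { id: string; lastLoginAt: Date; warnedAt?: Date; }

        const WARN_AFTER_MS = 2 * 365 * 24 * 60 * 60 * 1000;  // two years of inactivity
        const GRACE_PERIOD_MS = 30 * 24 * 60 * 60 * 1000;     // 30 days to respond to the warning

        // Decide what a periodic cleanup job should do with an account right now.
        function nextAction(account: Account, now: Date): "none" | "warn" | "delete" {
          const inactiveFor = now.getTime() - account.lastLoginAt.getTime();
          if (inactiveFor < WARN_AFTER_MS) return "none";
          if (!account.warnedAt) return "warn";                // send the warning email once
          const sinceWarning = now.getTime() - account.warnedAt.getTime();
          return sinceWarning > GRACE_PERIOD_MS ? "delete" : "none";
        }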

    Read the article

  • What should I be aware of when preparing documentation of a website for later maintenance?

    - by user782104
    The development team has finished a website and my duty is to prepare a document so that other programmers can maintain the website with ease. As I am inexperienced at this, I would like to ask what should be covered, and how the document should be structured. So far my only idea is to prepare an ERD diagram for the database and a flow chart for each function. Any other suggestions, e.g. which cookies are stored? Thanks

    Read the article

  • How to change my website's appearance in a Facebook wall post?

    - by Lode
    When posting a website link in a Facebook wall post, Facebook fetches some content (title, text and image) from the website to show it to readers. Is there a way I can adjust / propose which content is used / preferred by Facebook? I found someone saying to use <meta property="og:image" content="image.jpg">, but this doesn't seem to have any effect. But maybe Facebook caches these results for a while?
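
    For reference (not from the original question): Facebook reads Open Graph <meta> tags from the page's <head> when it scrapes a shared link, and og:title, og:description and og:image together control most of what the wall post shows; Facebook also caches its scrape of a URL for a while, so changes may not appear immediately. A minimal TypeScript sketch that renders those tags, with all names and URLs as placeholders:

        // Build the Open Graph tags for a server-rendered page's <head>.
        function renderOpenGraphTags(page: { title: string; description: string; imageUrl: string; url: string }): string {
          return [
            `<meta property="og:title" content="${page.title}" />`,
            `<meta property="og:description" content="${page.description}" />`,
            `<meta property="og:image" content="${page.imageUrl}" />`,   // should be an absolute URL
            `<meta property="og:url" content="${page.url}" />`,
          ].join("\n");
        }

        console.log(renderOpenGraphTags({
          title: "My article",
          description: "Short summary shown under the link.",
          imageUrl: "https://example.com/images/preview.jpg",
          url: "https://example.com/article",
        }));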

    Read the article

  • How does auto-loading of images work when a status is posted to a social website?

    - by huahsin68
    When you post a status update containing a URL on a social website like facebook.com or linkedin.com, the site automatically scans the images available on that website and puts one at the front of the status update. How could this be done if I wanted the same feature in my web app? Which web framework (JSF, RichFaces, jQuery, ...) should I use for such development? Besides that, are there any pre-built features for this available on blogger.com or wordpress.com?
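
    A minimal server-side sketch of how such a preview image can be found (an illustration, not from the post): fetch the linked page, prefer the og:image tag it advertises, and fall back to the first <img>. Written in TypeScript using the fetch built into Node 18+; a production version would use a proper HTML parser rather than regular expressions:

        async function findPreviewImage(pageUrl: string): Promise<string | null> {
          const html = await (await fetch(pageUrl)).text();

          // Prefer the image the page itself advertises for sharing.
          const og = html.match(/<meta[^>]+property=["']og:image["'][^>]+content=["']([^"']+)["']/i);
          if (og) return new URL(og[1], pageUrl).toString();

          // Otherwise fall back to the first <img> on the page, resolved to an absolute URL.
          const img = html.match(/<img[^>]+src=["']([^"']+)["']/i);
          return img ? new URL(img[1], pageUrl).toString() : null;
        }

        // Usage: resolve a thumbnail for a pasted link before rendering the status update.
        findPreviewImage("https://example.com/article").then(src => console.log(src));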

    Read the article

  • I want to be a developer (website and web application) and want to choose ASP.NET as a programming language

    - by jeet
    I want to be a developer (websites and web applications) and want to choose ASP.NET as my programming language. I am currently an intermediate web designer. My friend told me that there are many things in ASP.NET. My question is: since I am interested in website and web application development, is there anything specific I have to learn to be a web developer, or do I have to learn the whole thing? Also, what does a developer have to do, and what are the key skills if he chooses ASP.NET? Thanks :)

    Read the article

  • How to avoid lawsuits for a website that could enable illegal actions by users?

    - by Richard DesLonde
    How do you avoid getting into trouble with the law, or lawsuits, for a website that could enable illegal or potentially harmful activity by its users? The only thing I can think of is to make the content wholly user-generated, so that I have no control over, say in, or responsibility for what goes on there. As an example, consider a website that shows great ski and snowboarding locations: people could get hurt, or ski and snowboard on land where they are trespassing, etc. Thoughts?

    Read the article
