Search Results

Search found 18450 results on 738 pages for 'website attacks'.

Page 59 of 738

  • Would it be possible to build a client portal on Squarespace6?

    - by aBathologist
    I'm helping a family member set up a site which will need to include a secure client portal, providing access to documents and a simple database. I have been encouraging them to go with a more established, open-source CMS like Drupal or Joomla, whose capability in this area is evident. However, they have a strong preference for Squarespace. Does anyone know if it would be possible to accomplish this with the new developer platform for Squarespace 6? I've spent well over an hour searching Google, the Squarespace site and Stack Exchange, but can't find any clear answer to this question. I'm grateful for any insight you can provide.

    Read the article

  • Browser problem for background-size property [migrated]

    - by Sangram
    I am using a picture as the background of my blog's header. The CSS I have used is:

        #header-wrapper {
            height: 125px;
            padding: 0;
            margin: 0;
            background: url("http://3.bp.blogspot.com/_lxBSX0YJV58/TOspWPI1r-I/AAAAAAAAA34/uw872WFS3ME/s1600/headerbg.jpg") top center no-repeat;
            background-size: 1120px 124px;
        }

    The original width of the image is 990px, and I stretched it to 1120px with the background-size property. It looks OK in Firefox 4 and Opera 11 but doesn't work in IE7, Pale Moon, etc.: the image does not scale up and stays at 990px. You can check my blog HERE. Any help? How can I make it compatible with all browsers?
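
    A hedged aside on compatibility: background-size was still vendor-prefixed in several engines of that era, and IE8 and below don't support the property at all, so they need the proprietary AlphaImageLoader filter to scale a background image. A sketch of the usual stack of fallbacks (the filter line is an IE-only hack, not standard CSS, and worth testing before relying on it):

        #header-wrapper {
            height: 125px;
            padding: 0;
            margin: 0;
            background: url("http://3.bp.blogspot.com/_lxBSX0YJV58/TOspWPI1r-I/AAAAAAAAA34/uw872WFS3ME/s1600/headerbg.jpg") top center no-repeat;
            -moz-background-size: 1120px 124px;    /* Firefox 3.6 */
            -o-background-size: 1120px 124px;      /* Opera 10.x */
            -webkit-background-size: 1120px 124px; /* older WebKit */
            background-size: 1120px 124px;
            /* IE7/IE8 have no background-size; scale via the filter hack instead */
            filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='http://3.bp.blogspot.com/_lxBSX0YJV58/TOspWPI1r-I/AAAAAAAAA34/uw872WFS3ME/s1600/headerbg.jpg', sizingMethod='scale');
        }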

    Read the article

  • Is there a generally accepted maximum number of http requests for a web page load?

    - by MorganTiley
    I'm looking into optimizing my web application's client-side performance, but I can't figure out a good target for the number of HTTP requests per page load. How does YSlow calculate its grade? This doesn't seem to be documented. Also, many sites like linkedin.com and amazon.com get an F grade, yet their pages still load quite fast. How do they fail the grade but still perform well? Gmail gets an A grade with 43 unprimed/10 primed requests.

    Read the article

  • How to tackle archived who-is personal data with opt-out?

    - by defaye
    As far as I understand it, it is possible (in the UK at least) for non-trading individuals to opt out of having their address details displayed in the WHOIS information for a domain. What I want to know is: after opting out, how do individuals combat archived data? Is there any enforcement of this? How many WHOIS websites archive data, and what rights do we have to force them to remove that data without paying absurd fees? And if one capitulates to these scoundrels, what is the point in paying for the removal of archived data if that data can presumably resurface in another WHOIS repository? In other words, what strategy is one supposed to take, besides being wiser after the fact?

    Read the article

  • Looking for advice on B2B promotion [closed]

    - by IconicDigital
    Can anyone recommend affiliate networks that focus on B2B development? We are about to launch a UK job search engine that allows job boards to list their jobs on the engine. We have decided to keep the advertising in house, with the goal of keeping costs down. I was wondering if anyone could offer advice on potential advertising routes we could take, for example B2B affiliate networks, AdWords, etc. We are launching an empty site, and ideally we would like to attract recruitment agencies or businesses to sign up for either a free or a paid account; they can then begin to populate the engine with job listings. An obvious choice so far would be to promote on networks like LinkedIn. Any ideas? Thanks

    Read the article

  • I want something ready to start with

    - by BDotA
    I am looking for something quick and ready-made, like a weblog on WordPress or Blogspot, where I can publish posts about .NET, Java, or databases: quick tutorials with small code samples that visitors can use. I don't know anything about web design, and I just want a ready-made thing to use for this purpose. What do you suggest? Are there any examples I can take a look at?

    Read the article

  • Is there a colloquial term for the web dev equivalent of "Script Kiddies"? [on hold]

    - by Darkcat Studios
    I recently came across a couple of, let's say, "young people" who were convinced that they were "Web Developers", despite the fact that they were only able to configure a WordPress template, and not even able to edit images properly. This got me thinking: "back in the day", kids who thought they were hackers but just ran other people's scripts they had found were known as Script Kiddies. Do we have a term for these yet?

    Read the article

  • How can I get stats for what 3rd-party sites have embedded our iframe widget?

    - by Su'
    Say we've produced a widget for other sites to use, like so:

        <iframe src="http://example.com/whatever.php" frameBorder="0" width="200px" height="300px" scrolling="no"></iframe>

    The client would like to be able to see within GA who has embedded the thing. Is there some referrer information automatically passed that I can look for, or do I need to add something? whatever.php is already loading the analytics JavaScript (we're also tracking clicks on an outbound link). [EDIT] Looking around a bit more, I found what seems to be a similar question on SO with an answer saying this can be found automatically, but I still can't seem to find the information. The question is also old enough that the respondent is probably referring to the old interface. Maybe someone could explain getting to it in the new look. (I likely won't be able to train this client to switch, deal with the old look, etc.)
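
    One hedged approach, assuming the classic ga.js _gaq queue the page presumably already loads: inside an iframe, document.referrer on the initial load is the URL of the embedding page, so whatever.php can record it as an event. The 'Widget'/'Embedded on' names below are illustrative, not anything GA requires:

        // in whatever.php, after the usual _gaq/_setAccount setup
        // document.referrer inside a freshly loaded iframe is the parent page's URL
        var embedder = document.referrer || 'direct or unknown';
        _gaq.push(['_trackEvent', 'Widget', 'Embedded on', embedder]);

    The embedding URLs should then show up as event labels in the Events reports, which may be easier to point the client at than hunting for any automatic referrer report.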

    Read the article

  • Webcam Q&A page with guest "speakers" ideas?

    - by myladeybugg
    I want to host one-way webcam Q&As/AMAs and embed them on a page of my site. Specified hosts for these Q&As must be able to connect quickly and easily to the webpage and read user-posted messages (questions) for interaction. I know there are some systems out there, like Tinychat, but I'm unaware of anything built exactly for what I'm looking for. Perhaps something requiring a specific account/password before allowing streaming on the page, where I can email the host a password or directions to create an account to begin live streaming. Most of the people doing the question-and-answer sessions on webcam will have little technical knowledge, so extra cookie points go to ease of use! Thanks for ideas in advance! PS: I apologize if this is the incorrect section for posting this question; this is my first.

    Read the article

  • Dynamic mod_rewrite or how to plan a dynamic website

    - by Sophia Gavish
    Hi, I'm trying to make clean URLs for a blog on a dynamic website, but I think the real problem is that I don't know how to plan the website schema. I read about how to use mod_rewrite, and all I found is how to turn "http://www.website.com/?category&date&post-title" into "http://www.website.com/category/date/post-title". That works OK for me. The problem is that if my URL looks like "http://www.website.com/blog/?id=34", this method won't work, as far as I understand it. So I have two questions: 1. Is there a way to use mod_rewrite (maybe reading from a text file) to read the post title of my blog and rewrite my URL by date and post title? 2. Should I rewrite my website to query the data from one index file on the homepage and use mod_rewrite to produce the nice URL? Should I also query the date and the title of the post instead of just the post ID?
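
    A sketch toward both questions, with the caveat that it reverses the flow described above: mod_rewrite can't look up a post title itself on typical shared hosting (RewriteMap needs access to the main server config), so the standard pattern is to put a slug in the public URL, rewrite it to one index file, and have that file query the post by slug. The rule and parameter names here are illustrative:

        RewriteEngine On
        # /blog/2012/06/my-post-title -> /blog/index.php?y=2012&m=06&slug=my-post-title
        RewriteRule ^blog/([0-9]{4})/([0-9]{2})/([a-z0-9-]+)/?$ blog/index.php?y=$1&m=$2&slug=$3 [L,QSA]

    So to question 2: yes, store a slug (and date) with each post, have index.php query by slug instead of by numeric ID, and generate links in the /blog/yyyy/mm/slug form; that is essentially how WordPress-style permalinks work.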

    Read the article

  • Embed external website inside a page

    - by jasongullickson
    I'd like to load something from website B into a page on website A and contain the functionality of website B within a container on website A. I tried doing this using a div and jQuery's load() method, but I ran into cross-domain scripting issues (I think; it works with a local file but not a remote URL). I also tried using an iframe, but strange things happen: for example, when a link is clicked in the "contained" website B, it reloads the entire browser window, losing the content of website A. I've read about some server-side ways of handling this (and it may just come to that), but ideally I want something completely client-side, JavaScript and HTML. Any ideas?
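
    If the full-window reload is caused by website B using target="_top" links or frame-busting script, one client-side option (a sketch, assuming a browser that supports the HTML5 sandbox attribute, and assuming website B still functions under these restrictions, which frame-busting sites often deliberately don't) is to sandbox the iframe without allow-top-navigation, so the framed page cannot navigate your page:

        <!-- website-b.example.com is a placeholder for the real URL -->
        <!-- no allow-top-navigation, so B cannot replace website A's window -->
        <iframe src="http://website-b.example.com/"
                sandbox="allow-scripts allow-same-origin allow-forms"
                width="600" height="400"></iframe>

    For the load() route, the same-origin policy is the blocker; the server-side fix you've read about is a proxy on website A's own domain that fetches B and serves it locally, which is the only way to make injection-style embedding work.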

    Read the article

  • Detecting login credentials abuse

    Greetings. I am the webmaster for a small, growing industrial association. Soon, I will have to implement a restricted, members-only section for the website. The problem is that our membership includes both big companies and amateur "clubs" (it's a relatively new industry…). It is clear that those clubs will share the login ID they use to log onto our website. The problem is detecting whether one of their members has shared the login credentials with people who are not supposed to be accessing the website (there is no objection to such a club having all its members get on the website). I have thought about logging, along with each sign-on, the IP address as well as the OS and browser used; if the OS/browser stays constant and there are no more than, say, 10 different IP addresses, the account is clearly used by very few different computers. But if there are 50 OS/browser combinations and 150 different IPs, the credentials have obviously been disseminated far, and there would then be cause for action, such as changing the password. Of course, it is extremely annoying when your password is unilaterally changed. So, for this problem, I thought about allowing the "clubs" to manage their own lists of sub-accounts; if abuse is suspected, the user responsible could be easily pinned down, and this "sub-member" alone would face the annoyance of a password change. Question: what potential problems does anyone see with such an approach?
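
    As a sketch of just the detection half, assuming each sign-on is logged to a hypothetical login_log table with account, IP and user-agent columns, a periodic query can flag accounts whose credentials have spread too far (the thresholds and the MySQL syntax are illustrative):

        -- accounts used from suspiciously many places in the last 30 days
        SELECT account_id,
               COUNT(DISTINCT ip_address) AS distinct_ips,
               COUNT(DISTINCT user_agent) AS distinct_agents
        FROM login_log
        WHERE login_at >= NOW() - INTERVAL 30 DAY
        GROUP BY account_id
        HAVING distinct_ips > 10 OR distinct_agents > 10;

    With the sub-account scheme, the same query grouped by sub-account pins the abuse on an individual instead of the whole club.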

    Read the article

  • Wishful Thinking: Why can't HTML fix Script Attacks at the Source?

    - by Rick Strahl
    The Web can be an evil place, especially if you're a Web Developer blissfully unaware of Cross Site Script Attacks (XSS). Even if you are aware of XSS in all of its insidious forms, it's extremely complex to deal with all the issues if you're taking user input and you're actually allowing users to post raw HTML into an application. I'm dealing with this again today in a Web application where legacy data contains raw HTML that has to be displayed and users ask for the ability to use raw HTML as input for listings.

    The first line of defense of course is: just say no to HTML input from users. If you don't allow HTML input directly and use HTML encoding (HttpUtility.HtmlEncode() in .NET, or standard ASP.NET MVC output like @Model.Content) you're fairly safe, at least from the HTML input provided. Both WebForms and Razor support HTML-encoded content, although Razor makes it the default. In Razor the default @ expression syntax:

        @Model.UserContent

    automatically produces HTML-encoded content; you actually have to go out of your way to create raw HTML content (safe by default) using @Html.Raw() or the HtmlString class. In Web Forms (v4) you can use:

        <%: Model.UserContent %>

    or, if you're using a version prior to 4.0:

        <%= HttpUtility.HtmlEncode(Model.UserContent) %>

    This works great as a hedge against embedded <script> tags and HTML markup, since any HTML is turned into text that displays the markup but doesn't render it. But it turns any embedded HTML markup tags into plain text: if you need to display HTML in raw form, with the markup tags rendering based on user input, this approach is worthless.

    If you do accept HTML input and need to echo the rendered HTML back, the task of cleaning up that HTML is complex. In the projects I work on, customers frequently ask for the ability to post raw HTML. Almost every app I've built where there's document content from users starts out with text-only input, possibly using something like Markdown, but inevitably users want to just post plain old HTML they created in some other rich editing application. I see this a lot with realtors especially, who often want to reuse their postings easily in multiple places. In my work this is a common problem, and I've tried dozens of different methods, from sanitizing to simple rejection of input to custom markup schemes, none of which have ever felt comfortable to me. They work in a half-assed, hacked-together sort of way, but I always live in fear of missing something vital, which is *really easy to do*.

    My Wishlist Item: A <restricted> tag in HTML

    Let me dream here for a second on how to address this problem. It seems to me the easiest place where this can be fixed is: in the browser. Browsers are actually executing script code, so they have a lot of control over the script code that resides in a page. What if there were a way to specify that you want to turn off script code for a block of HTML? The main issue when dealing with raw HTML input isn't that we as developers are unaware of the implications of user input, but the fact that we sometimes have to display raw HTML the user provides. So the problem markup is usually isolated in only a very specific part of the document.

    So, what if we had a way to specify that in any given HTML block no script code could execute, by wrapping it in a tag that disables all script functionality in the browser? This would include <script> tags and any document script attributes like onclick, onfocus etc., and potentially also disallow things like iframes that can be scripted from within the iframe's target. I'd like to see something along these lines:

        <article>
            <restricted allowscripts="no" allowiframes="no">
                <div>Some content</div>
                <script>alert('go ahead make my day, punk!');</script>
                <div onfocus="$.getJSON('http://evilsite.com/')">more content</div>
            </restricted>
        </article>

    A tag like this would basically disallow all script code from firing from any HTML that's rendered within it. You'd use this only on code that you actually render from your data, and only if you are dealing with custom data. So something like this:

        <article>
            <restricted>
                @Html.Raw(Model.UserContent)
            </restricted>
        </article>

    For browsers this would actually be easy to intercept. They render the DOM and control loading and execution of scripts that are loaded through it. All the browser would have to do is suspend execution of <script> tags and not hook up any event handlers defined via markup in this block. Given all the crazy XSS attacks that exist and the prevalence of this problem, this would go a long way towards preventing at least coded script attacks in the DOM. And it seems like a totally doable solution that wouldn't be very difficult for vendors to implement. There would also need to be some logic in the parser to not allow a </restricted> or <restricted> tag into the content, to prevent short-circuiting the restricted section (per James Hart's comment).

    I'm sure there are other issues to consider that I didn't think of in my off-the-back-of-a-napkin concept here, but the idea overall seems worth consideration I think. Without code running in a user-supplied HTML block, it'd be pretty hard to compromise a local HTML document and pass information like cookies to a server, or even send data to a server, period. Short of an iframe that can access the parent frame (which is another restriction that should be available on this <restricted> tag) and could potentially communicate back, there's not a lot a malicious site could do. The HTML could still 'phone home' via image links and href links and basically say this site was accessed, but without the ability to run script code it would be pretty tough to pass along critical information to the server beyond that.

    Ahhhh… one can dream… Not holding my breath, of course. The design-by-committee that is the W3C can't agree on anything in timeframes measured in less than decades, but maybe this is one place where browser vendors can actually step up the pressure, since it is in their best interest to significantly reduce the attack surface for vulnerabilities on their browser platforms. Several people commented on Twitter today that there isn't enough discussion on issues like this that address serious needs in the web browser space. Realistically, security has to be a number one concern with Web applications in general; there isn't a Web app out there that is not vulnerable. And yet nothing has been done to address these security issues, even though there might be relatively easy solutions to make this happen.

    It'll take time, and it's probably not going to happen in our lifetime, but maybe this rambling thought sparks some ideas on how this sort of restriction can get into browsers in some way in the future.

    © Rick Strahl, West Wind Technologies, 2005-2012. Posted in ASP.NET, HTML5, HTML, Security.
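
    Until (if ever) something like <restricted> exists, input-side sanitization remains the fallback the article alludes to. A minimal sketch, assuming the Microsoft AntiXSS library (Microsoft.Security.Application) is referenced; its whitelist-based fragment sanitizer strips script tags and event-handler attributes, though like any sanitizer it should be treated as one layer of defense, not a guarantee:

        using Microsoft.Security.Application;

        public static class HtmlInput
        {
            // strip scripts and event handlers from user-supplied HTML before storing it
            public static string Clean(string rawHtml)
            {
                return Sanitizer.GetSafeHtmlFragment(rawHtml);
            }
        }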

    Read the article

  • How to wipe RAM on shutdown (prevent Cold Boot Attacks)?

    - by proper
    My system is encrypted using full-disk encryption, i.e. everything except /boot is encrypted using dm-crypt/LUKS. I am concerned about cold boot attacks. Prior work:

    https://tails.boum.org/contribute/design/memory_erasure/
    http://tails.boum.org/forum/Ram_Wipe_Script/
    http://dee.su/liberte-security
    http://forum.dee.su/topic/stand-alone-implementation-of-your-ram-wipe-scripts

    Can you please provide instructions on how to wipe the RAM once Ubuntu is shut down or restarted? Thanks for your efforts!
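
    For reference, the Tails design linked above comes down to running sdmem (from the secure-delete package) as late as possible during shutdown, after filesystems are unmounted; a plain init script can only approximate that, so treat the following as a rough, unverified sketch rather than a hardened solution:

        # one-time setup: sdmem ships in the secure-delete package
        sudo apt-get install secure-delete

        # /etc/init.d/wipe-ram (hypothetical script, must be made executable)
        #!/bin/sh
        # overwrite free RAM; -ll selects the fastest, least thorough mode, -f the fast random source
        sdmem -llf

        # hook it into the halt (0) and reboot (6) runlevels as late as possible
        sudo update-rc.d wipe-ram start 99 0 6 .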

    Read the article

  • Mobile traffic of my second site redirected to my base site [Edited, have a look]

    - by Faheem Rasheed
    I have a strange issue with my website, and I can't understand what causes the problem. I would highly appreciate your help. The scenario: I have two websites, Website A and Website B. Website A is hosted in the root directory of my hosting account. Within this root directory I have a subfolder "subfolder_A", and within that another subfolder "subfolder_B" that contains Website B, so the path looks like root/subfolder_A/subfolder_B/. All goes fine when I access Website B from my desktop/laptop: Website B loads normally. But when I access it through a mobile device, the mobile site of Website A is loaded, while it should load Website B. Also, note that the two websites have different URLs; neither is a subdomain of the other. What could the problem be? The .htaccess of Website B or Website A, or something else? Here is the .htaccess of Website B:

        RewriteEngine On
        RewriteRule .* index.php [F]
        RewriteBase /
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
        RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule .* index.php [L]

    Read the article

  • Switching web hosting company & database errors

    - by gipap
    Well, here is the situation. I used to have CompanyA for web hosting (the hosting plan was a shared one). I decided to change hosting provider and transfer my website to CompanyB (exclusive IP). The issue I face is that my webpage is now displayed at two different IP addresses, so I decided to turn off the website served by CompanyA. Now the problem is that my database-driven website, served by CompanyB, is not driven anymore, although I have added the A record mssql.mywebsite.com with the IP address of the database (the database is served by a dedicated DB server). So, what am I doing wrong here?

    Read the article

  • My php homepage downloads index.php instead of being processed on Gandi.net

    - by alekone
    If I go to the homepage of my website http://www.website.com (on a brand-new server), index.php gets downloaded instead of processed. I don't have the same problem in other folders. My .htaccess reads:

        AddHandler php5-script .php

    What could this be? I suspect it's something with the PHP config or the .htaccess, but I'm not able to figure it out. Help please! Edit: I don't know if this helps: it's a WordPress installation, and I have this problem only on the public part of the website, not in the admin (which renders correctly).
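
    The handler name given to AddHandler has to match one the host's Apache actually defines; php5-script belongs to mod_php setups, and on CGI/FastCGI-style shared hosting it may match nothing, so the file falls through as a plain download. Which line works is host-specific (worth checking Gandi's documentation), but commonly seen variants to try one at a time are:

        AddHandler application/x-httpd-php .php
        # or
        AddType application/x-httpd-php .php
        # or, on some suPHP/CGI setups, a versioned handler name
        AddHandler application/x-httpd-php5 .php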

    Read the article

  • Make my web-server traffic go through proxy?

    - by Eli
    I have a question that may or may not be answerable. Basically, Comcast isn't going to let me host a website on their ISP unless I have a business account. So I spoke with my uncle, who is big into networking, and he told me to host my website through a proxy so Comcast cannot associate the website IP with my IP. I have purchased a proxy from proxy-hub.com, but I cannot seem to figure out what to do next. I may be approaching this totally wrong, and I may need to create my own proxy server. Anybody have a clue? Thanks.
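
    What the uncle describes is really a reverse proxy: visitors hit a rented machine, which forwards the traffic to the server at home, so only the rented machine's IP is public; a list-style browsing proxy from proxy-hub.com won't do that. A hedged sketch using a reverse SSH tunnel to a cheap VPS (user and vps.example.com are placeholders):

        # run on the home server: publish home port 80 on the VPS's port 8080
        ssh -N -R 0.0.0.0:8080:localhost:80 user@vps.example.com

    The VPS's sshd needs GatewayPorts yes for the forwarded port to be reachable from outside; running nginx or Apache mod_proxy on the VPS is the sturdier version of the same idea. Note this only hides the home IP from visitors; it doesn't change what Comcast's terms of service allow.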

    Read the article

  • How to configure S3 or DNS to handle incomplete name (sans www) for web site?

    - by user193116
    I have set up a bucket called "www.mydomainname.com" to host my website, and I have configured the CNAME such that "www.mydomainname.com" points to my endpoint http://www.mydomainname.com.s3-website-us-east-1.amazonaws.com/. It works: people who type the full URL "www.mydomainname.com" see my index page. But most people are in the habit of typing an incomplete domain name; they just type "mydomainname.com", and their browser fails to find my site. Is there a way to configure the CNAME or S3 bucket such that typing "mydomainname.com" takes them to my S3 website? (I am using Network Solutions as my DNS provider.)
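
    Amazon's answer to the bare domain (a sketch; in today's terms this is done with the AWS CLI, while at the time it was console-only) is a second bucket named exactly mydomainname.com whose only job is to redirect every request to the www bucket:

        aws s3api put-bucket-website --bucket mydomainname.com \
          --website-configuration '{"RedirectAllRequestsTo":{"HostName":"www.mydomainname.com"}}'

    The catch is DNS: the apex of a zone can't be a CNAME, so "mydomainname.com" can't point at an S3 website endpoint from Network Solutions directly. The practical options are Network Solutions' own web-forwarding feature (forward mydomainname.com to www.mydomainname.com) or moving DNS to Route 53, which supports alias records at the apex.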

    Read the article

  • Website Vulnerabilities

    - by Ben Griswold
    The folks at the Open Web Application Security Project publish a list of the top 10 vulnerabilities. In a recent CodeBrew I provided a quick overview of them all and spent a good amount of time focusing on the most prevalent vulnerability, Cross Site Scripting (XSS). I gave an overview of XSS, stepped through a quick demo (sorry vulnerable site), reviewed the three XSS variations and talked a bit about how to protect one's site. References and reading materials were also included in the presentation and, look at that, they are provided here too:

    Open Web Application Security Project
    The OWASP Top Ten Vulnerabilities (pdf)
    OWASP List of Vulnerabilities
    The 56 Geeks Project by Scott Johnson
    ha.ckers.org
    OWASP XSS Prevention Cheat Sheet
    Wikipedia
    Is XSS Solvable?, Don Ankney
    The Anatomy of Cross Site Scripting, Gavin Zuchlinski

    Read the article

  • Enable Claims-based Auth on a SP2010 website after it has been provisioned


    Read the article

  • list of things to think about for hosting a potentially high traffic website

    - by SpashHit
    I do my own hosting for a few clients on my own VPS (Linode). Since my clients so far have been extremely low-traffic, I have not had to really dig into some of the considerations I would need for a higher-traffic site. Now I am bidding on a client whose site will be potentially higher-traffic (not Facebook or Twitter, but higher than Joe's ice cream shop). Is there a list of things I need to think about that I may be missing? I am going to assume, at least at first, that I will be able to handle it on my shared Linode, but I could move to a dedicated Linode if need be. I am not yet thinking of multiple servers, but short of that there are still considerations, for example mod_perl instead of straight CGI, better backups, etc. What else? In case it matters, the stack will be Debian Linux / Apache / Perl / MySQL / Template Toolkit.

    Read the article

  • Tool to export Microsoft project to website?

    - by Rory
    Just wondering, does anyone know of a free/open-source tool that takes a Microsoft Project file and exports it to HTML? I know you can save a project file as HTML, so I wanted a tool that would do this automatically, maybe also displaying graphs/Gantt charts as well. If not, any ideas on how I would write a program to do this, preferably in Java? I know of Aspose.Tasks (http://www.aspose.com/categories/.net-components/aspose.tasks-for-.net/default.aspx), which can export project files to Gantt charts in PNG format, but it's not free and is only available for .NET/C#.
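
    For the roll-your-own route, a rough Java sketch using the open-source MPXJ library (http://mpxj.sourceforge.net), which reads .mpp files. This only emits a plain task table, not a Gantt chart, and the task-list accessor has been renamed across MPXJ versions (older releases use getAllTasks()), so check it against the version you pull in:

        import java.io.FileWriter;
        import net.sf.mpxj.ProjectFile;
        import net.sf.mpxj.Task;
        import net.sf.mpxj.mpp.MPPReader;

        public class ProjectToHtml {
            public static void main(String[] args) throws Exception {
                // args[0] = input .mpp file, args[1] = output .html file
                ProjectFile project = new MPPReader().read(args[0]);
                FileWriter out = new FileWriter(args[1]);
                out.write("<table border='1'><tr><th>Task</th><th>Start</th><th>Finish</th></tr>\n");
                for (Task task : project.getAllTasks()) {
                    out.write("<tr><td>" + task.getName()
                            + "</td><td>" + task.getStart()
                            + "</td><td>" + task.getFinish() + "</td></tr>\n");
                }
                out.write("</table>\n");
                out.close();
            }
        }

    For the Gantt side, the same task data could feed a JavaScript charting library, since MPXJ itself doesn't draw anything.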

    Read the article

  • Website Editor control for WYSIWYG/regions

    - by Dan Smith
    For lack of a better title, let me try to explain further: I'm looking for a control that will allow me to have a library of "page elements" (such as a list of employees, a photo gallery, a contact form, etc.) that could be dragged onto the page canvas. The page canvas would have pre-set regions/boxes where these items could be dropped, preventing the user from screwing up the page's layout. I'm looking for any pre-built commercial (or open-source with commercial use allowed) tools like this.

    Read the article

  • My software is hosted on a "bad" website. Can I do anything about it?

    - by Abluescarab
    The software I've created is hosted on what you could call a "bad" website. It's hard to explain, so I'll just provide an example. I've made a free password generator. This, along with most of my other FREE software, is available on this website. This is their description of my software:

        Platform: 7/7 x64/Windows 2K/XP/2003/Vista
        Size: 61.6 Mb
        License: Trial
        File Type: .7z
        Last Updated: June 4th, 2011, 15:38 UTC
        Avarage Download Speed: 6226 Kb/s
        Last Week Downloads: 476
        Toatal Downloads: 24908

    Not only is the size completely skewed, it is not trial software, it's free software. The thing is that it's not the description I'm worried about--it's the download links. The website is a scam website. They apparently link to "cracks" and "keygens", but not only is that in itself illegal, they actually link to fake download websites that give you viruses and charge your credit card. Just to list things that are wrong with this website: they claim all software is paid software then offer downloads for keygens and cracks; they fake all details about the program and any program reviews and ratings; they and the downloads site they link to are probably run by the same person, so they make money off of these lies. I'm only a teenager with no means to pursue legal action. This means that, unfortunately, I can't do anything that will actually get results. I'd like my software to only be downloaded off my personal website. I have links to four legitimate locations to download my software and that's it. Essentially, is there anything I can do about this? As I said above, I can't pursue legal action, but is there some way I can discourage traffic to that website by blacklisting it or something? Can I make a claim on MY website to only download my software from the links I provide? Or should I just pay no mind? Because, honestly, it's a bit of a ways back in Google results. Thank you ahead of time.

    Read the article
