Search Results

Search found 18210 results on 729 pages for 'website promotion'.


  • please give guidance on making a social networking website

    - by user287745
    What is an API? If I make a database and a social site in ASP.NET and C# with SQL and host it, how can I ask others to make applications for it? I mean, how is that possible? I have the code, so is an API a piece of software? I want to make a social networking site like Facebook, but it's proving to be very, very difficult: AJAX, JavaScript and all that. Please help by providing examples, readable code, and guidance. Thank you. I had asked a similar question before but deleted it as it was not clear.

    Read the article

  • Multilanguage website sitemap

    - by Alex
    My site is i18n-based with the following structure:

        mydomain.com/en
        mydomain.com/en/product/blue-widgets
        mydomain.com/fr
        mydomain.com/fr/product/blue-widgets

    The site is internationalised, not localised, so I don't want to geo-target specific locales, just target French- or English-speaking users. When submitting a sitemap to the search engines, should I send one sitemap with links to all the different language versions, or have one separate sitemap for each language? Is that even possible?
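
    One way to reconcile the two options is a sitemap index: one sitemap per language, all referenced from a single index file that is submitted to the search engines. A minimal sketch of such an index (the file names are illustrative, not prescribed):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://mydomain.com/sitemap-en.xml</loc>
          </sitemap>
          <sitemap>
            <loc>http://mydomain.com/sitemap-fr.xml</loc>
          </sitemap>
        </sitemapindex>

    Each referenced sitemap would then list only the /en/... or /fr/... URLs for its language.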

    Read the article

  • How do I get a screenshot of a given website using C#

    - by Ender
    I'm writing a specialised crawler and parser for internal use and I require the ability to take a screenshot of a web page in order to check what colours are being used throughout. The program will take in around ten web addresses and save each as a bitmap image; from there I plan to use LockBits to create a list of the five most used colours within the image. To my knowledge it's the easiest way to get the colours used within a web page, but if there is an easier way to do it please chime in with your suggestions. Anyway, I was going to use this program until I saw the price tag. I'm also fairly new to C#, having only used it for a few months. Can anyone provide me with a solution to my problem of taking a screenshot of a web page in order to extract the colour scheme?

    EDIT: Sorry for not getting back to this sooner, but I've been busy with some other things. Anyway, the code seems to work well, but the problem I am having right now is that I am running it within a form, and naturally with Application.Run() being called I cannot run two instances of the same form at once. Form.ShowDialog() was recommended, but that broke everything. Can anyone give me a hand with this code?

        public static void buildScreenshotFromURL(string url)
        {
            int width = 800;
            int height = 600;
            using (WebBrowser browser = new WebBrowser())
            {
                browser.Width = width;
                browser.Height = height;
                browser.ScrollBarsEnabled = true;

                // This will be called when the page finishes loading
                browser.DocumentCompleted += new System.Windows.Forms.WebBrowserDocumentCompletedEventHandler(OnDocumentCompleted);
                //browser.DocumentCompleted += OnDocumentCompleted;
                browser.Navigate(url);

                // This prevents the application from exiting until Application.Exit is called.
                // Application.Run() does not work as it cannot be called twice; Form.ShowDialog()
                // was recommended, but that still has issues.
                Application.Run();
            }
        }

        public static void OnDocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
        {
            // Define size of thumbnail needed
            int thumbSize = 50;

            // Now that the page is loaded, save it to a bitmap
            WebBrowser browser = (WebBrowser)sender;
            using (Graphics graphics = browser.CreateGraphics())
            {
                using (Bitmap bitmap = new Bitmap(browser.Width, browser.Height, graphics))
                {
                    Rectangle bounds = new Rectangle(0, 0, bitmap.Width, bitmap.Height);
                    browser.DrawToBitmap(bitmap, bounds);

                    // thumbCall is a GetThumbnailImageAbort callback and handleImage is a
                    // helper method, both defined elsewhere in the application.
                    Bitmap thumbBitmap = new Bitmap(bitmap.GetThumbnailImage(thumbSize, thumbSize, thumbCall, IntPtr.Zero));
                    thumbBitmap.Save("screenshot.png", ImageFormat.Png);
                    handleImage(thumbBitmap);
                }
            }
        }
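
    A minimal alternative sketch that avoids the Application.Run() problem entirely: each capture runs on its own STA thread with its own little message pump, so it never touches the form's message loop and can be called repeatedly. The class and method names here are illustrative, not from the original code.

        using System;
        using System.Drawing;
        using System.Drawing.Imaging;
        using System.Threading;
        using System.Windows.Forms;

        public static class PageCapture
        {
            public static void CapturePage(string url, string outputFile)
            {
                var thread = new Thread(() =>
                {
                    using (var browser = new WebBrowser())
                    {
                        browser.Width = 800;
                        browser.Height = 600;
                        browser.ScrollBarsEnabled = false;
                        browser.ScriptErrorsSuppressed = true;
                        browser.Navigate(url);

                        // Pump messages on this thread until the page has finished loading.
                        while (browser.ReadyState != WebBrowserReadyState.Complete)
                            Application.DoEvents();

                        using (var bitmap = new Bitmap(browser.Width, browser.Height))
                        {
                            browser.DrawToBitmap(bitmap, new Rectangle(0, 0, bitmap.Width, bitmap.Height));
                            bitmap.Save(outputFile, ImageFormat.Png);
                        }
                    }
                });

                // The WebBrowser control wraps ActiveX and requires an STA thread.
                thread.SetApartmentState(ApartmentState.STA);
                thread.Start();
                thread.Join();
            }
        }

    Because each call gets its own thread and message pump, several URLs can be captured one after another without ever calling Application.Run() from the form.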

    Read the article

  • speeding up website load using multiple servers/domains

    - by Mohammad
    The Yahoo! developer guide says, "Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective." As an explanation, I read somewhere that browsers will load up to 5 things simultaneously from the same domain. Would a subdomain, for example cdn.example.com, be considered a new domain in the context of that statement?

    Read the article

  • IIS website is sending multiple content-type headers for zip files

    - by frankadelic
    We have a problem with an IIS5 server. When certain users/browsers click to download .zip files, binary gibberish text sometimes renders in the browser window. The desired behavior is for the file to either download or open with the associated zip application. Initially, we suspected that the wrong content-type header was set on the file. The IIS tech confirmed that .zip files were being served by IIS with the MIME type "application/x-zip-compressed". However, an inspection of the HTTP packets using Wireshark reveals that requests for zip files return two Content-Type headers:

        Content-Type: text/html; charset=UTF-8
        Content-Type: application/x-zip-compressed

    Any idea why IIS is sending two content-type headers? This doesn't happen for regular HTML or image files. It does happen with ZIP and PDF. Is there a particular place we can ask the IIS tech to look? Or is there a configuration file we can examine?

    Read the article

  • Markup filter wanted for a public website

    - by sibidiba
    Developing a community site where everyone can post text, I'm looking for a markup filter:

        - Anything that is not part of the markup must be escaped (htmlspecialchars()) as-is.
        - It should turn URLs automatically into links.
        - It should support some form of basic markup (bold, image, url, pre, list).
        - It should have a simple parser that turns user input text into HTML.

    Content on the site is public to everyone, so XSS must not be allowed to happen. What do you suggest? Which markup language in the first place: BBCode, wiki syntax, Markdown? Are there any complete APIs with good examples? PHP is available on the server side. If there is a WYSIWYG-like textarea in addition (like here on SO), that would be a fantastic bonus!

    Read the article

  • Using Active Directory to authenticate users on a WWW-facing website

    - by Basiclife
    Hi, I'm looking at starting a new web app which needs to be secure (if for no other reason than that we'll need PCI accreditation at some point). From previous experience working with PCI (on a domain), the preferred method is to use integrated Windows authentication, which is then passed all the way through the app to the database. This allows for better auditing as well as object-level permissions (i.e. an end user can't read the credit card table). There are advantages in that even if someone compromises the web server, they won't be able to glean any additional information from the database. Also, the web server isn't storing any database credentials (beyond perhaps a simple anonymous user with very few permissions). So, now I'm looking at the new web app, which will be on the public internet. One suggestion is to have an Active Directory server and create Windows accounts in AD for each user of the site. These users will then be placed into the appropriate NT groups to decide which DB permissions they should have (and which pages they can access). ASP.NET already provides the AD membership provider and role provider, so this should be fairly simple to implement. There are a number of questions around this (scalability, reliability, etc.) and I was wondering if there is anyone out there with experience of this approach or, even better, some good reasons why to do it / not to do it. Any input appreciated. Regards, Basiclife
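
    For the credential check itself, a minimal sketch using System.DirectoryServices.AccountManagement (.NET 3.5+); the domain and group names are placeholders. The same checks can also be done declaratively through the ActiveDirectoryMembershipProvider and a role provider in web.config, as mentioned above.

        using System.DirectoryServices.AccountManagement;

        public static class AdLogin
        {
            public static bool Validate(string userName, string password)
            {
                using (var context = new PrincipalContext(ContextType.Domain, "MYDOMAIN"))
                {
                    return context.ValidateCredentials(userName, password);
                }
            }

            public static bool IsInGroup(string userName, string groupName)
            {
                using (var context = new PrincipalContext(ContextType.Domain, "MYDOMAIN"))
                using (var user = UserPrincipal.FindByIdentity(context, userName))
                using (var group = GroupPrincipal.FindByIdentity(context, groupName))
                {
                    return user != null && group != null && user.IsMemberOf(group);
                }
            }
        }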

    Read the article

  • simple blog engine for website

    - by Alexander
    I know there are a lot of free open-source blog engines out there, such as BlogEngine.NET. However, that's overkill for my purpose... So far I have created my own simple one by storing each post in an .xml file, so every time the main page loads it reads all those XML files and displays them as posts. Now my problem is that when a user clicks on a post title, I want it to show on a new page (.aspx). So if the title is X, I want a new page called X.aspx when the user clicks on the title on the homepage. I hope this makes sense. My question is: how do I create such a thing?
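
    One common way to get a page per title without generating physical files is URL rewriting: a single Post.aspx serves every post, and requests such as /My-Post-Title.aspx are rewritten to it. A minimal sketch in Global.asax, assuming ASP.NET WebForms (Post.aspx and the "title" query parameter are illustrative names):

        using System;
        using System.IO;
        using System.Web;

        public class Global : HttpApplication
        {
            protected void Application_BeginRequest(object sender, EventArgs e)
            {
                string path = Request.AppRelativeCurrentExecutionFilePath;

                // Rewrite requests for .aspx pages that do not physically exist
                // to the single post page, passing the title along.
                if (path.EndsWith(".aspx", StringComparison.OrdinalIgnoreCase) &&
                    !File.Exists(Server.MapPath(path)))
                {
                    string title = Path.GetFileNameWithoutExtension(path);
                    Context.RewritePath("~/Post.aspx?title=" + HttpUtility.UrlEncode(title));
                }
            }
        }

    Post.aspx would then read Request.QueryString["title"] and load the matching XML file.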

    Read the article

  • Subscription website architecture questions + SQL Server & .NET

    - by chopps
    Hey guys, I have a few questions about the architecture of a subscription service I am about to embark on, and I am looking for some feedback on how best to set it up. I won't have as large a number of customers as Basecamp, maybe a few hundred, and I was wondering what would be a solid architecture for setting up the customer sites. I'm running SQL Server and .NET on a dedicated machine. Should I create a new database for each customer, so as to have control and isolation of data, or keep them all in one database? I am also thinking of creating a sub-domain for each customer as well, so modifications can be made to each site as needed. The customer URLs would look like this: https://customer1.foobar.com https://customer2.foobar.com I am going to have the ability to 'plug in' reports that will be uploaded to the site so each customer can customise as needed. Off the top of my head this necessitates having each sub-domain on its own code base for the uploading of these reports. So on the main site the customer would sign up for their new subscription, and I would programmatically create a new directory for the customer from the main code base, then create a sub-domain pointing to the new directory for the customer, and finally their database. Does this sound about right? Am I on the right track? How do other such sites accomplish the same thing? Thanks for letting me bend your ear for a bit on this.
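
    If all customers share one code base, the tenant can be resolved per request from the sub-domain instead of duplicating directories. A minimal sketch (the connection-string naming convention is an assumption, not part of the question):

        using System;
        using System.Configuration;
        using System.Web;

        public static class TenantResolver
        {
            public static string CurrentTenant()
            {
                // "customer1.foobar.com" -> "customer1"
                string host = HttpContext.Current.Request.Url.Host;
                return host.Split('.')[0];
            }

            public static string CurrentConnectionString()
            {
                // One database per customer: connection strings named after the tenant.
                var setting = ConfigurationManager.ConnectionStrings[CurrentTenant()];
                if (setting == null)
                    throw new InvalidOperationException("Unknown tenant: " + CurrentTenant());
                return setting.ConnectionString;
            }
        }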

    Read the article

  • Website revision control system

    - by kylex
    I'm looking for the ability to use a revision control system for websites, but ALSO have the revisions go live immediately. Example: a developer commits to the repository, and those changes are immediately pulled from the repository and go live. Any suggestions on available software?

    Read the article

  • Website URL whitelists

    - by buggedcom
    I'm building a user content parser and am adding an automatic link parser. I'm adding a dialogue that confirms that the user wants to go to the particular site being linked to. This is for two reasons: anti-phishing and spam combating. However, I want to be able to disable both the dialogue and the nofollow additions for commonly used websites, so I'm building a whitelist. Are there any common whitelists around, or should I start building one from scratch?

    Read the article

  • .NET website creating a directory on a remote server: access denied

    - by tmfkmoney
    I have a web application that creates directories. The application works fine when creating a directory on the web server; however, it does not work when it tries to create a directory on our remote file server. The file server and the web server are in the same domain. I have created a local user in our domain, "DOMAIN\aspnet". The local user is on both servers. I am running my .NET app pool under the domain user. I have also tried using Windows impersonation in the web.config to run under the domain user. I have verified that the domain user has full control on the remote directory. In an effort to debug this I have also given "Everyone" full control on the remote directory and added the domain user to the administrators group. I have a simple .NET test page on the web server to test this. Through the test page I am able to read the directory on the file server and get a list of everything in it. I am not able to upload files or create directories on the file server.

    Here's code that works:

        var path = @"\\fileserver\images\";
        var di = new DirectoryInfo(path);
        foreach (var d in di.GetDirectories())
        {
            Response.Write(d.Name);
        }

    Here's code that doesn't work:

        path = Path.Combine(path, "NewDirectory");
        Directory.CreateDirectory(path);

    Here's the error I'm getting:

        Access to the path '\\fileserver\images\NewDirectory' is denied.

    I'm pretty stuck on this. Any ideas?
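
    In case the app-pool identity is not being carried across to the file server, a minimal sketch of doing the create under explicit impersonation with LogonUser; the account, password and path are placeholders, and LOGON32_LOGON_NEW_CREDENTIALS is commonly used for access to remote shares:

        using System;
        using System.IO;
        using System.Runtime.InteropServices;
        using System.Security.Principal;

        public static class RemoteDirectory
        {
            [DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
            private static extern bool LogonUser(string user, string domain, string password,
                                                 int logonType, int logonProvider, out IntPtr token);

            [DllImport("kernel32.dll", SetLastError = true)]
            private static extern bool CloseHandle(IntPtr handle);

            public static void Create(string path)
            {
                IntPtr token;
                if (!LogonUser("aspnet", "DOMAIN", "password",
                               9 /* LOGON32_LOGON_NEW_CREDENTIALS */,
                               3 /* LOGON32_PROVIDER_WINNT50 */, out token))
                {
                    throw new InvalidOperationException(
                        "LogonUser failed: " + Marshal.GetLastWin32Error());
                }

                try
                {
                    // Reverts automatically when the impersonation context is disposed.
                    using (WindowsImpersonationContext ctx = WindowsIdentity.Impersonate(token))
                    {
                        Directory.CreateDirectory(path);
                    }
                }
                finally
                {
                    CloseHandle(token);
                }
            }
        }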

    Read the article

  • Block a specific IP block from my website in PHP

    - by iTayb
    I'd like, for example, to block every IP starting with 89.95 (89.95.x.x). I don't have .htaccess files on my server, so I'll have to do it with PHP.

        if ($_SERVER['REMOTE_ADDR'] == "89.95.25.37") die();

    That would block a specific IP. How can I block entire IP blocks? Thank you very much.

    Read the article

  • how to add a feedback form to a Joomla 1.5.15 website

    - by Amzy
    Hi, I am new to Joomla and am using Joomla 1.5.15 hosted on GoDaddy. Currently no new/additional components/extensions have been installed. I want to add a feedback form (with a few text boxes, check boxes, radio buttons, etc.) to my site, which should also send an email to some email address without storing the information in a database. Kindly guide me on how to do that. Is it required to install any components? If so, kindly suggest the non-commercial component(s). Thanks in advance. Thanks, Amzy

    Read the article

  • Problem integrating a WordPress blog into a CakePHP website

    - by Nishant
    Hello everyone, I was working on a CakePHP site which was successfully delivered. But recently the client appeared again and asked me to add a WordPress blog to it, to cover the blogging side of his site. He wants to share authentication between CakePHP and WP: whoever registers on his site and then logs in must, when clicking on the Blog tab, be redirected to the WP blog with the session still there. After some googling I have installed it in the /app/webroot/blog folder, but I am not able to edit the .htaccess file. Please point me in the right direction: first, how to share authentication between CakePHP and WordPress, and second, how to customise the .htaccess file so that the URLs look good. Thanks in advance..!

    Read the article

  • make a website page get news articles from feeds

    - by Andy
    Hi, I would like to start publishing some news articles from time to time. One option would be to create a new web page for every article, but it sounds like this can be done more easily. For instance, I noticed Clear Channel uses some kind of feed, like: http://www.z100.com/cc-common/news/sections/newsarticle.html?feed=104650&article=7011404 I don't know what it means or how it works, but it looks like a nice way to get the article from some kind of database, I guess? Does someone know how this works, or some other way to create new articles, e.g. some dynamic functionality to generate links to the articles and get the right article from a database? Thanks!
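
    The pattern in that URL is a single article page driven by a query string. A minimal sketch of the idea, assuming ASP.NET code-behind (the table, column and connection-string names are made up for illustration):

        using System;
        using System.Configuration;
        using System.Data.SqlClient;
        using System.Web.UI;

        public partial class NewsArticle : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                int articleId;
                if (!int.TryParse(Request.QueryString["article"], out articleId))
                    return;

                string connStr = ConfigurationManager.ConnectionStrings["News"].ConnectionString;
                using (var conn = new SqlConnection(connStr))
                using (var cmd = new SqlCommand(
                    "SELECT Title, Body FROM Articles WHERE ArticleId = @id", conn))
                {
                    cmd.Parameters.AddWithValue("@id", articleId);
                    conn.Open();
                    using (SqlDataReader reader = cmd.ExecuteReader())
                    {
                        if (reader.Read())
                        {
                            Response.Write("<h1>" + Server.HtmlEncode((string)reader["Title"]) + "</h1>");
                            Response.Write((string)reader["Body"]); // article body stored as HTML
                        }
                    }
                }
            }
        }

    Links to articles can then be generated from the same table, e.g. newsarticle.aspx?article=123.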

    Read the article

  • What movie website allows people to scrape it?

    - by Sergio Tapia
    I've wanted to make a C# library to scrape movie information and return it to the application, but someone told me that it's against the TOS. RottenTomatoes seems to have no problem with it from what I've read on their licensing page, but I'm not quite sure. Where could I acquire movie information legally and without cost? It's for an open source application hosted here: LINK

    Read the article
