Search Results

Search found 4618 results on 185 pages for 'websites'.


  • How to talk a client out of a Flash website?

    - by bunglestink
    I have recently been doing a bunch of web side projects that come through word-of-mouth recommendations only. Although I am much more of a programmer than a designer, my design skills are not terrible, and I do not hate dealing with UI the way many programmers do. As a result, I find myself lured into a bunch of side projects where, aside from a minimal back end for content administration, most of the programming is on front-end interfaces (read: JavaScript/CSS). By far the biggest frustration I have had is convincing clients that they do not want Flash. Aside from the fact that I really do not enjoy Flash "development", there are many practical reasons why Flash is not desirable: lack of compatibility across devices, decreased client accessibility, plug-in requirements, increased development time, etc. Instead of just flat out telling clients "I will not build you a Flash website", I would much rather use tactics to convince them, or explain to them, that this is not what they actually want: that Flash does not meet their requirements any better than standard HTML/CSS/JS does, and that it distracts users from their content. What kind of first-hand experience do others have with this? How do you explain to someone that JavaScript/CSS/AJAX is usually a better option for most websites? Why do people want Flash so badly to begin with? This question pertains to clients who have no technical reasons for wanting Flash, but just want it because they think it makes pretty websites.


  • Deploy code to Windows Azure from Dropbox

    - by Gopinath
    There is a lot of innovation happening in the Windows Azure team these days under the leadership of ScottGu. The recent updates to Windows Azure, published a couple of days ago, allow us to deploy code to Windows Azure websites straight from Dropbox. It's very easy and simple to use: authorize your Windows Azure account to talk to Dropbox, and whenever you want to deploy the latest code from Dropbox, just click a button. Boom! The latest code from Dropbox is automatically deployed to Windows Azure. Everything works like magic. Isn't this a cool feature for those who don't want to maintain a version control system like Git, SVN or TFS? This is a big deal for the many developers who keep the source code of their personal websites in Dropbox. Wondering why developers keep source code in Dropbox? Dropbox is easy to use (zero learning curve), while setting up a source control system demands a lot of administrative work as well as money for hosting. Though I'm not going to use this feature for deploying the code of my own website, coziie.com (I have a personal SVN server hosted), I'm going to recommend it to all my friends who keep their source code in Dropbox. For more details, read the detailed post on ScottGu's blog.


  • Link to article on website libraries

    - by acidzombie24
    I just started another website, and it has taken me 30 minutes of copying/pasting from my other website and deleting stuff because I don't have a template. There are lots of features I copied over that I haven't seen in libraries/templates, though admittedly I don't really know many libraries/templates. This site is ASP.NET. Some of the things I have: a string.Format variant that escapes strings for HTML (so <hi> comes out as text instead of a tag), helpers for adding or removing items in the URL query string, a class that takes an ASP.NET error and logs it or converts it into a row in a DB (I know about ELMAH, but during development of my last site it wasn't Mono compatible), a mini AJAX library for success/fail/redirect/etc., and anything else I would use in every site. I don't like my (library) design, because I wasn't expecting to do more than 2-3 websites and I am on my 5th. I don't know proper ASP.NET either, so is there an article that explains how to make a great library/template for websites?


  • I'm confused about encryption and SSL

    - by ChowKiko
    While my friends and I were planning to run our own website, we got confused about encryption and about attackers eavesdropping on traffic (what telephony calls wire tapping; I don't know what it is called in computing today). I just want to know how encryption works with websites if we are using PHP + MySQL. For a user login, is it OK to have (PHP) encrypt the input value, then have (PHP) decrypt and validate it against the (MySQL) data? Or to have (PHP) encrypt the input value and compare it with the decrypted (MySQL) data to see if they match? Is it the same if we use $_SESSION without encryption in PHP on top of MySQL? Does encryption in PHP also help against data manipulation? I'm so confused T_T. In regard to what I stated above: can a hacker intercept the data if the server uses $_SESSION? Is $_SESSION safe? If a hacker can intercept it, is it necessary to use SSL on our website? Why do merchant websites use SSL, and why does Facebook likewise use SSL? If there is no SSL, which is the better approach: encrypting the data with PHP before it goes to MySQL, or no encryption at all while the PHP server uses $_SESSION?
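
    For reference, the common pattern here is not to encrypt and decrypt passwords at all, but to hash them one-way in PHP before they reach MySQL, and to let SSL protect the data in transit. A minimal sketch, assuming PHP 5.4+ (5.5+ for password_hash/password_verify, which are built in) and a hypothetical users table:

        <?php
        // Registration: store only a one-way hash, never the plain password.
        // PASSWORD_DEFAULT is currently bcrypt; the hash cannot be "decrypted".
        $pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');

        function register(PDO $pdo, $login, $password) {
            $hash = password_hash($password, PASSWORD_DEFAULT);
            $stmt = $pdo->prepare('INSERT INTO users (login, pass_hash) VALUES (?, ?)');
            $stmt->execute([$login, $hash]);
        }

        // Login: fetch the stored hash and let password_verify() compare.
        function check_login(PDO $pdo, $login, $password) {
            $stmt = $pdo->prepare('SELECT pass_hash FROM users WHERE login = ?');
            $stmt->execute([$login]);
            $row = $stmt->fetch();
            return $row && password_verify($password, $row['pass_hash']);
        }

        // $_SESSION data stays on the server; only the session ID travels
        // in a cookie, which is why SSL still matters for the transport.
        session_start();
        if (check_login($pdo, $_POST['login'], $_POST['password'])) {
            $_SESSION['user'] = $_POST['login'];
        }

    The division of labour in this sketch: hashing protects the stored passwords, $_SESSION keeps state on the server, and SSL is what actually stops eavesdropping on the wire.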


  • Webmaster Tools, www and no-www, duplicate content and subdomains

    - by Jay
    I have not come to any conclusive answers, after many hours of research on many websites, on the specific issue I am trying to figure out. My company has two websites: a main one at www.example.com, and our self-hosted blog at subdomain.example.com, which is a subdomain of the first. The way Google sees these, the www version and the no-www version (called "naked" from here on) of each domain are actually different sites. I completely understand this. It is also advised that both be set up in Google Webmaster Tools, which I have done; correct me if I am wrong about having both set up. Now, it appears that we can set a preferred domain in Webmaster Tools only at the root domain level. The subdomain cannot have this; the tool actually says "Restricted to root level domains only". So it appears that the subdomain should follow what the root domain says, which for our preferred domain is to display www.example.com and not the naked version. That is one issue: one site displays one way and the other displays another. Do we have the wrong redirects in place for the subdomain? Another question: does this have any effect on SEO, in terms of duplicate content on the web, given how we have set this up?
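
    A common fix for the duplicate-content worry is a site-wide 301 redirect from the naked host to the preferred www host. This is usually done in the web server configuration, but the same idea in PHP looks roughly like the sketch below (example.com standing in for the real domain):

        <?php
        // 301-redirect naked-domain requests to the preferred www host,
        // so only one version of each URL is ever indexed.
        $preferred = 'www.example.com';
        if (strcasecmp($_SERVER['HTTP_HOST'], 'example.com') === 0) {
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: http://' . $preferred . $_SERVER['REQUEST_URI']);
            exit;
        }

    With the root domain canonicalized like this, and the subdomain redirecting consistently to its own preferred form, the two spellings resolve to one site rather than duplicates.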


  • Has anybody earned $0.25+ for each captcha passed on their website?

    - by vgv8
    I am a real dummy when it comes to web monetization schemes. [1] reports: "Solve [Media] charges a fee of about 25 cents to 50 cents for each form that is filled out using a Type-In ad [captcha]... the company splits its fees 50-50 with the websites where the ads are placed." Honestly, I cannot imagine that anyone in their right mind pays that much money for a single captcha passed. And how should these claims be understood? http://www.solvemedia.com/images/ie9_aboutcalcount.png shows one such ad; why would Microsoft pay $0.25-0.50 for each entered string "Be part of the Beta"? Have any webmasters (sysadmins) actually received that much from SolveMedia captchas deployed on their websites? Is it a scam? Because if you check the sites mentioned in http://www.solvemedia.com/gallery.html (for example, http://www.toyotanation.com/forum/register.php?do=register), they do not have such captchas. What am I missing? Cited: [1] Jennifer Valentino-DeVries, "An Online Ad That's Tough to Ignore", Wall Street Journal Digits blog, September 20, 2010, http://blogs.wsj.com/digits/2010/09/20/an-online-ad-thats-tough-to-ignore/


  • Domain forwarding to an IE "trusted site" opens a blank page

    - by Michael Jasper
    My employer, a University, regularly hosts conferences and other events. While the websites for these events are hosted on our domain, organizers frequently request customized .com URLs, and we then forward those domains to the specific site. Recently we discovered a problem where a page will not load if the following conditions are met (using a current example):
    - a website is created in our CMS for a conference, at http://continue.weber.edu/nulc
    - a domain is created, http://www.nulc2012.com, and forwarded to http://continue.weber.edu/nulc
    - the user enters http://www.nulc2012.com into their address bar using IE7 or IE8
    - the user has *.weber.edu listed as a "trusted site" in the IE security settings (the case for nearly all on-campus computers)
    When this happens, the browser correctly transfers to the page http://continue.weber.edu/nulc/index.php, but the page is blank, returning only:
      <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
      <HTML><HEAD>
      <META content="text/html; charset=windows-1252" http-equiv=Content-Type></HEAD>
      <BODY></BODY></HTML>
    Is there any known solution to this problem? Or am I missing something completely? Note: the tested websites do load correctly in Chrome, Firefox, and Safari.


  • Continuous integration never results in build errors

    - by Jon
    Hi, I'm working with a variety of Java EE websites which use internal libraries we've developed. For each website, we only upgrade to new versions of our internal libraries as needed, and before committing we make sure that the site compiles fine. What this means is that when TeamCity does a build of one of our sites, the site compiles fine, but later, when the site is updated to the latest version of the internal libraries, there might be a compile error. Is there a good way to handle this? We're not using Maven yet; would using Maven mean that our websites could automatically build against the latest versions of the internal libraries? Thanks. Clarification, this is what we sometimes run into: Project A depends on a library and is currently using version 1.0 of it. Project B also depends on that library. I make changes to the library so that it is now version 1.5, and Project B now uses 1.5. Project A and Project B have both been built just fine by the CI server (TeamCity). Working on Project A again, I update it to 1.5 and discover that 1.5 has breaking changes in it. Is there a way for the CI server to discover these kinds of breaking changes?


  • Creating a Website Without a Framework [closed]

    - by James Jeffery
    I've been using PHP frameworks for so long that I've actually forgotten the "best practices" for creating websites without one. Usually I use Symfony, or more recently Laravel. A client wants a very simple website, but with certain parts of it dynamic; due to the nature of the site, using WordPress or a framework is out of the question. I pride myself on my code, but I feel like I'm asking such a basic question that it's killing me to ask it. So: what are the best practices for creating websites without a framework? I like to live by the K.I.S.S. ("Keep It Simple, Stupid!") way of thinking, so my idea was to just create the .php pages that are required, do any page processing or database interaction at the top of each page, and have the HTML below the closing PHP tag, with any helpers/functions in a functions.php file. This is what I remember doing way back before I used frameworks, and to me it seems like a very old-school way of doing things. I've not created a site without a framework for literally 2+ years, so I've lost my way with the basics. Any advice would be greatly appreciated.
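
    For what it's worth, a minimal sketch of the pattern described above: processing at the top, HTML below the closing PHP tag. The page, table and field names here are made up for illustration:

        <?php // contact.php: process first, render after
        require __DIR__ . '/functions.php';          // shared helpers

        $pdo = new PDO('mysql:host=localhost;dbname=site', 'dbuser', 'dbpass');
        $errors = [];

        if ($_SERVER['REQUEST_METHOD'] === 'POST') {
            $name  = isset($_POST['name'])  ? trim($_POST['name'])  : '';
            $email = isset($_POST['email']) ? trim($_POST['email']) : '';
            if ($name === '' || !filter_var($email, FILTER_VALIDATE_EMAIL)) {
                $errors[] = 'Please supply a name and a valid email address.';
            } else {
                $stmt = $pdo->prepare('INSERT INTO messages (name, email) VALUES (?, ?)');
                $stmt->execute([$name, $email]);
                header('Location: thanks.php');      // redirect-after-POST
                exit;
            }
        }
        ?>
        <!DOCTYPE html>
        <html><body>
          <?php foreach ($errors as $e): ?>
            <p><?php echo htmlspecialchars($e); ?></p>
          <?php endforeach; ?>
          <form method="post">
            <input name="name"> <input name="email"> <button>Send</button>
          </form>
        </body></html>

    Kept to one page, with htmlspecialchars() on every echo and prepared statements for the database, this stays K.I.S.S. without giving up the protections a framework would normally provide.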


  • What kind of website or coding is suitable and safe for an artist's website

    - by Dan S
    I have a web design project for a singer. I used Joomla for previous projects and designed good music websites, but for this project I cannot find a suitable template to edit and use. As the website is so simple and does not need any special functionality, I'm thinking about creating it with just plain CSS, HTML and jQuery. I'm good with them and can make it look perfect, but I am not sure about the security. In Joomla I use various security plugins, but I don't know what applies to client-side scripting. So generally I need your ideas on the following questions: Is Joomla, or a CMS generally, a good option for a music website? What are famous artists' websites based on, a CMS or client-side scripting? Do you recommend creating it manually, without using a CMS or template? And do you suggest WordPress for this type of website? (The website will have these pages: biography, news, music (with a music player), photos, videos and contacts.) Update: thank you for all your responses. I had a look at Joomla, and the only template I chose is This One, which seems very simple, but I am worried about module positions, because it seems not to have any module positions at all. I tried to contact the provider but did not get any response. Does anyone know about its module positions? I mean, is there any way to find them, and is it possible to create 2-3 module positions? Also, I had a look at ThemeForest's WordPress templates, and they have some great ones; I think WordPress is more active in creating artistic templates. But is it secure and professional enough to use this CMS for a singer who is kind of famous in his country? I am talking about a template like this. Share your opinions, guys.


  • Google affecting my SERP Rank?

    - by Asad Moeen
    The following are some of my website's details. Home page: [thebluewaffles].[com]. Keywords: Blue Waffles; the rest of the keywords are post/subject specific. Site description: health articles blog. Site age: 1.5 years. A short history: when I started my website, the few rules I had in mind when posting content were at least 500 words on each page and writing every article with to-the-point information. I didn't go really fast with it, which is why I only have about 15 articles in 1.5 years. The SEO strategy was fairly simple: I shared links through social marketing websites and some article sharing websites, after which I could see my website in the top 5 SERP results. I ranked well enough for about 8 months continuously, but then didn't keep updating content, which led to some 3 rough months when no content was posted due to personal work. The SERPs dropped to the 2nd page in April and almost started disappearing in May. I asked a lot of people about it, and most came up with the reason "no updates to the site", so I started updating my site again; but now that November has almost started, I still see no sign of my website's rankings recovering. Another important point: when I post a new article and do a title search in Google, it ranks well enough for the first 10 hours and then disappears. What could be wrong here?


  • Travelocity Delivers Superior Customer Experience and Reduces Operating Costs with Oracle RightNow

    - by Tony Berk
    Turning the spotlight to our newest member of the CRM and Customer Experience (CX) family, RightNow, we highlight one of many customer success stories.  Travelocity is a leading provider of consumer-direct travel services for the leisure and business traveler. It markets and distributes travel-related products and services directly to individuals through Travelocity and its various brand websites and contact centers, and websites owned by its supplier and distribution partners. Before RightNow, Travelocity was running one system for its agent desktop and a separate email solution. Toggling between systems was inefficient and cumbersome. The RightNow contact center solution enables Travelocity to react at a moment’s notice and get customers the information they need before, during, and after their trip while maximizing agent productivity and driving revenue. Superior customer experience is one key reason why Travelocity continues to be a leader in the industry. The RightNow contact center solution supports Travelocity across its global brands with multi-channel support to provide superior care however customers communicate with the company—via phone, email, web, chat or mobile. Click here to learn more about Travelocity's use of Oracle RightNow and review other RightNow success stories.


  • Develop and Use Applications with MySQL and PHP

    - by Antoinette O'Sullivan
    Want to develop and use applications with PHP and the MySQL database? Consider taking the MySQL and PHP: Developing Dynamic Web Applications training course. Before taking this course you should:
    - Understand how HTML files are assembled
    - Understand fundamental PHP syntax
    - Have some programming experience (preferably PHP)
    - Have some experience with relational databases
    - Have some knowledge of Object-Oriented Programming
    This 4-day live, instructor-led course is perfect for developers who use PHP and MySQL to build and maintain their websites and who want to learn how PHP and MySQL can be used to rapidly prototype and deploy dynamic websites. You can take this course as a:
    - Live-virtual event: take this event from your own desk, no travel required, choosing from a selection of virtual events already on the schedule.
    - In-class event: travel to an education center to take this course. Below is a selection of events already on the schedule.
      Location           | Date             | Delivery Language
      Jakarta, Indonesia | 3 December 2013  | English
      Rome, Italy        | 5 May 2014       | Italian
      Turin, Italy       | 17 March 2014    | Italian
      Warsaw, Poland     | 12 November 2013 | Polish
      Madrid, Spain      | 16 December 2013 | Spanish
      Tunis, Tunisia     | 17 March 2014    | French
    For more information on the authentic MySQL curriculum, go to http://oracle.com/education/mysql.
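
    As a taste of what "dynamic web applications with PHP and MySQL" means in practice, a minimal page might look like the sketch below (the host, credentials and articles table are placeholders):

        <?php
        // Connect with PDO (bundled with PHP) and render one article.
        $pdo = new PDO('mysql:host=localhost;dbname=demo;charset=utf8',
                       'dbuser', 'dbpass',
                       array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

        $stmt = $pdo->prepare('SELECT title, body FROM articles WHERE id = ?');
        $stmt->execute(array(isset($_GET['id']) ? (int)$_GET['id'] : 1));
        $article = $stmt->fetch(PDO::FETCH_ASSOC);

        // Escape on output so stored text cannot inject markup.
        if ($article) {
            echo '<h1>' . htmlspecialchars($article['title']) . '</h1>';
            echo '<p>'  . htmlspecialchars($article['body'])  . '</p>';
        }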


  • First project is a big one; how much should we charge?

    - by confuzzled
    Two of my cousins and I started a freelance computer repair/web design business just to make some money on the side during college, and we received our first major web design project about three weeks ago. Now, we've created websites before, but mostly for family businesses, and we have never really charged money; most of those websites were static and didn't really require a CMS. This project, however, was a big one (for us anyway). We created a news site that had several categories; we created the banners; we created a classifieds page (not a web app, just something static that they control); plus several links, a few graphical assets, a CSS drop-down menu, an RSS feed from a different news site, weather, and all the normal stuff you would find on a regular news site. On top of that we put in all the usual Joomla stuff (search, JComments, JSlide pictures, JCE, etc.). Then we uploaded the first 10 articles they gave us, and we are going to train them in how to use Joomla. Now, at first we settled on 700 dollars: I assumed they just wanted a simple blog-like website where they could upload articles, but then we had a meeting and they asked for a lot more. Note: we did not hand-code the template from scratch, but customized the Gantry framework to fit their needs; we did code quite a bit, however. I estimate that we put in about 50-60 hours in total. I'm wondering if 700 dollars is a bit low; this price is definitely not set in stone. Please keep in mind that this is our first project and we are newbies, so please be kind. Thank you!


  • Session Tracking - Advantages and Disadvantages

    I used to work for a major internet company that sold dental plans to consumers through a large customer-driven website. They started tracking users as soon as they hit the web servers, and then logged everything they could about each user. There are a lot of benefits to session tracking, for both the user and the website. Users benefit from session tracking because a website can retain pertinent information for them so that they do not have to re-enter the same information repeatedly. In addition, websites can hold specific items in a cart for each user, so that users can pay for all of their items at once when they are ready to complete their purchases. Websites also benefit from session tracking because they can determine where a specific user came from and which advertising partner produced a sale; this information is very useful when deciding where to spend an advertising budget. There is only one real disadvantage to session tracking: users cannot really control what a website actually tracks. Yes, they can disable cookies, and this will help, but that means that no tracking can be done at all, and most sites require users to have cookies enabled in order to make purchases or log in to their accounts.
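
    A minimal sketch of the cart-retention pattern described above, using PHP's built-in sessions (the item IDs and page names are made up):

        <?php
        // Session tracking in PHP: only the session ID travels in a cookie;
        // the cart and referrer data stay on the server between page views.
        session_start();

        if (!isset($_SESSION['cart'])) {
            $_SESSION['cart'] = array();                  // first visit
        }
        if (isset($_GET['add'])) {
            $_SESSION['cart'][] = (int)$_GET['add'];      // e.g. shop.php?add=42
        }

        // Remember which advertising partner referred this user, so a later
        // sale can be attributed to them.
        if (!isset($_SESSION['referrer']) && isset($_SERVER['HTTP_REFERER'])) {
            $_SESSION['referrer'] = $_SERVER['HTTP_REFERER'];
        }

        echo 'Items in cart: ' . count($_SESSION['cart']);

    If the visitor has cookies disabled, session_start() has nothing to resume on the next request, which is exactly the trade-off described above.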


  • Google is not treating two Australian schools as separate sites when both are subdomains of qld.edu.au

    - by LuckySpoon
    My question relates to two websites, each of which is a "Calvary Christian College", but in two totally different locations and entirely unrelated to each other (except by name, and thus domain). All schools in the state are issued a <school-name>.qld.edu.au subdomain, in this case calvary.qld.edu.au and calvarycc.qld.edu.au. Now, what's interesting is that these domains are crossing each other in sitelinks for searches such as "calvary christian college townsville": some of the sitelinks shown are for one school (the Townsville school, as per the search term) and some are for the other school. I put in a sitelink demotion for this 6 months ago (we control calvary.qld.edu.au), but we're seeing no change on the results page. I have been able to get the owners of calvarycc.qld.edu.au to submit demotions for our domain, which should go in sometime in the next few days. What can we do to tell Google that these websites are not interchangeable, despite both appearing as "subdomains" of qld.edu.au? We can possibly open channels of communication with the administrators of qld.edu.au, but we would need to tell them what to change, and at this point I'm out of ideas.


  • Glue Records creation

    - by FFrewin
    I need some information on the following issue, as I would like to get it clear in my mind. I have a VPS server. All the sites hosted on this VPS use nameservers on a .gr domain, ns1.greekdomain.gr and ns2.greekdomain.gr. The .gr domain is a domain I own with a Greek registrar. Now I want to move two websites with .co.uk domain names to my VPS. The .co.uk domain names are registered with a UK-based registrar. When I went into the domain management panel and changed the nameservers of my domains to the greekdomain.gr nameservers, the panel returned an error about invalid nameservers. After digging, I found that my nameservers are not valid because they do not exist as records in the .co.uk registry. And here my big trouble starts. The .co.uk registrar tells me that I have to ask my hosting provider or .gr registrar to create a new record in the .uk registry for my nameservers. The .gr registrar tells me that my UK registrar needs to create the record. At Nominet (the .co.uk registry), one employee told me I need to ask my UK registrar; another employee (who seemed not to understand what I was asking) told me they cannot change my nameservers for me and that I should contact anyone else (old hosting provider, UK registrar, .gr registrar) for help. I can't get help from anybody. I have been trying for the last week to transfer my websites to my VPS and I can't. So the question is: who is responsible, and who is able, to create glue records for my nameservers?


  • Any danger in using the Wine workaround in 12.04?

    - by TrailRider
    To run certain Windows programs in WINE you need this workaround: echo 0 | sudo tee /proc/sys/kernel/yama/ptrace_scope. According to the support websites, this is due to a bug in the Ubuntu kernel that prevents ptrace and WINE from playing well together. With the above command you set ptrace_scope to 0, which, from the research I've done (don't ask me which websites, I have seen a lot of them), controls how freely one process may inspect another via ptrace; the 0 setting is more permissive than 1. I have to assume there was a good reason Ubuntu wanted ptrace_scope=1, so this leads me back to the short form of the question: are there any risks involved in setting ptrace_scope=0? Lower security? Problems debugging? Anything else I haven't thought of? P.S. For anybody reading this who wonders what the bug causes: the Windows programs fail to open at all; in the System Monitor you will see many instances of the program trying to open and eventually all quitting; and if you run the program from the terminal you get an error telling you that the maximum number of program instances has been reached.


  • Help with the Yelp and Google APIs for restaurants

    - by chris
    OK, I have looked into this, and I'm not sure if anyone else has experience with it. I'm having tremendous difficulty with the Yelp and Google APIs. To help explain, here is the concept of the website: we have to pull restaurants based on the user's distance, and then randomize them, weighted by the quality of the restaurant based on feedback from review websites (Yelp, Google, Urbanspoon, Zagat, OpenTable, Kudzu, Yahoo; it doesn't have to be from all of them) and feedback from our own users (on the results page for the randomly chosen restaurant, users can select good recommendation/bad recommendation). There's a lot we could calculate in our formula. Your results will partly depend on whether you're at home or at work. If you're at home, you have more time to drive out to the city to grab dinner or lunch. If you're at work, we have to recommend restaurants nearby, as lunch is typically 30 minutes to an hour: a 30-minute lunch most likely requires take-out or quick service, while with an hour-long lunch break you could dine in at a local fine-dining restaurant. So, in a nutshell: a user comes to the website, selects whether they're at home or at work, clicks submit, and a random restaurant is selected for them. If they don't like it, they can click retry and a new restaurant will show. The issue I am having is using the APIs to gather all the restaurants in the US. I know it can be done, because there are similar websites/apps that pull the restaurants closest to you, such as Ness and Alfred (I believe there are two more but I can't remember the names). Does anyone know how this can be accomplished?
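
    To give an idea of the nearby-restaurant query involved, here is a rough PHP sketch against Yelp's current business search endpoint (the API key, coordinates and radius are placeholders; Google Places follows a similar request/response pattern):

        <?php
        // Query Yelp's search API for nearby restaurants, then pick one at
        // random. Endpoint and parameters per Yelp's v3 (Fusion) docs.
        $params = http_build_query(array(
            'categories' => 'restaurants',
            'latitude'   => 33.7490,       // user's location (placeholder)
            'longitude'  => -84.3880,
            'radius'     => 3000,          // metres; tight radius for a work lunch
            'limit'      => 50,
        ));

        $ch = curl_init('https://api.yelp.com/v3/businesses/search?' . $params);
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPHEADER     => array('Authorization: Bearer YOUR_YELP_API_KEY'),
        ));
        $data = json_decode(curl_exec($ch), true);
        curl_close($ch);

        $businesses = isset($data['businesses']) ? $data['businesses'] : array();
        if ($businesses) {
            $pick = $businesses[array_rand($businesses)];
            echo $pick['name'] . ' (rated ' . $pick['rating'] . ')';
        }

    Note that neither API hands over "all the restaurants in the US" in one call; results are paginated and radius-bound, so a service like this queries per user, per location, and caches the results.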


  • Enterprise Portal Issue with the Ax Demo VPCs

    - by ssmantha
    Microsoft's Ax Demo VPC is configured for a static IP address, 192.168.0.1; this is because the VPC has a Domain Controller configured in it, which requires a static IP. When you put this VPC on a network with a different subnet and change the IP, you can observe that the sites http://sharepoint and http://sharepoint/EP cease to function and show "Page Not Found" errors in the browser. This is mainly due to the DNS configuration, which is not updated. To make the sites function properly again, change the corresponding entries in the Forward Lookup Zones of the DNS management console. The websites default, SharePoint and projectserver are all mapped to a single port in IIS, port 80, and are distinguished by host headers; those host headers are configured in DNS, and the entries are left pointing at the wrong IP address when you change the IP address of the VPC. Just change these values to point to the local loopback adapter (127.0.0.1), and change the DNS server in the TCP/IP properties to point to this address as well. This will resolve the issue with the website rendering. Initially you may get time-out errors while browsing these websites; be patient and try again, and they will work.


  • Photos being copied all over the place

    - by plua
    We have a rather popular website with plenty of photos. Our whole business depends on our content, and the photos are important in this. We invest a lot of time, effort, and money in taking these pictures. On our website we have clear copyright notices, we have the website name and logo in the photos, and we have a photo licensing page which states the prices for licensing our photos. Despite all this, our photos are copied by personal and commercial websites alike. We really want to do something about this. We do not want them simply to take the photos down and leave it at that; we want them to pay for the usage, as we clearly state on our website. Now a few questions come to mind: Can we legally demand payment right away, or are we obligated to first send a "cease and desist" letter? Our photos are used on websites throughout the world; are there any worldwide rules for this? Does anybody have experience with pursuing these things outside their home country? Should we hire a lawyer in each country, or could a local lawyer contact overseas companies directly?


  • Mozilla Persona to the login rescue?

    - by Matt Watson
    A lot of websites now allow us to log in or create accounts via OAuth or OpenID. We can use our Facebook, Twitter, Google, Windows Live and other accounts. The problem with a lot of these is that we have to grant the websites access to account and profile data that they shouldn't really have. Take the Twitter authorization screen shown when signing in via Technorati: now Technorati can follow new people, update my profile and post tweets? All I wanted to do was log in to Technorati.com to comment on a post! Mozilla has just released their new solution for this, called Persona. First thought: oh great, another solution! But they are actually providing something a little different and better. It is based on an email address and isn't linked to our personal social networks or their information. Persona exists only to help with logging in to websites, with no loose strings attached. Persona is based on a new standard called BrowserID, and you can read more about it here: How BrowserID Works. The goal is to integrate BrowserID into the browser at a deeper level so that no password entry is required at all; you can tell your web browser to just sign you in automatically. I am really hoping this takes off and will look at implementing it in current projects! I would recommend researching it, and let's hope it, or something like it, becomes a widespread reality in the future.


  • Using .htaccess, can you hide the true URL?

    - by Richard Muthwill
    So I have a web hotel with one main website, http://www.myrootsite.com/, and a few websites in subdirectories, in a folder called projects. I have domain names pointing to the subdirectories, but when hovering over a link in those websites, the URLs are shown as: http://www.myrootsite.com/projects/mysubsite/contact.html. When I'm on mysubsite.com I want them to be shown as: http://www.mysubsite.com/contact.html. I spoke to support for the web hotel and the guy said to try using .htaccess, but I'm not sure exactly how to do this. Thank you very much for your time! Edit, for more information: my website is http://www.example1.com/ and I also own http://www.example2.com/. All of example2.com's files are in example1.com/projects/example2/. When you visit example2.com, you'll notice all of the URLs point towards example1.com/projects/example2/, but I want them to point towards example2.com/. Can this be done? I hope this is enough info to go on. Edit, for w3d: I go to the URL mysubsite.com and the browser shows the URL mysubsite.com. The service I'm using creates an iframe around myrootsite.com and uses the URL mysubsite.com. I just hate that in Firefox and Internet Explorer, hovering over a link shows that the destination URL is myrootsite.com/projects/mysubsite/...



  • Storing images in file system and returning URLs or virtually resizing and returning byte arrays?

    - by ismaelf
    I need to create a REST web service to manage user-submitted images and display them all on a website. There are multiple websites that are going to use this service to manage and display images. The requirement is to have 5 pre-defined image sizes available. The 2 options I see are the following. Option 1: the web service creates the 5 images, stores them in the file system and stores their URLs in the database when the user submits the image; when the image is requested, the web service returns an array of URLs. I see this option as being a little hard on the hard drive: the estimates are 10,000 users per site and, let's say, 100 sites. The heavy processing is done once, when the user submits the image, and each image is then served from the file system. Option 2: the web service stores just the image that the user submits in the file system, and its URL in the database; when a user requests images, the web service gets the info from the DB, loads the image into memory, creates its 5 variants and returns an object with 5 image byte arrays (which I would probably cache). This option is harder on the processor and memory, since the heavy processing happens whenever the images are requested. A plus I see for option 2 is that it would give me the option to rewrite the URLs of the images and make them site-dependent (prettier), rather than having one image repository for all websites; but this is not a big deal. What do you think of these options? Do you have any other suggestions?
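
    For option 1, a minimal sketch of pre-generating the size variants at upload time with PHP's bundled GD extension (the widths, paths and URL scheme are placeholders):

        <?php
        // Pre-generate the predefined sizes once, when the user uploads.
        $widths = array(1600, 1024, 640, 320, 120);   // the 5 predefined sizes

        function makeVariants($srcPath, $destDir, array $widths) {
            $src  = imagecreatefromjpeg($srcPath);    // GD ships with most PHP builds
            $urls = array();
            foreach ($widths as $w) {
                $scaled = imagescale($src, $w);       // height keeps aspect ratio
                $file   = $destDir . '/' . $w . '_' . basename($srcPath);
                imagejpeg($scaled, $file, 85);        // JPEG quality 85
                imagedestroy($scaled);
                $urls[] = '/images/' . basename($file);
            }
            imagedestroy($src);
            return $urls;                             // these URLs go in the DB
        }

        $urls = makeVariants('/tmp/upload.jpg', '/var/www/images', $widths);

    Option 2 would call the same resizing code inside the request handler instead, trading disk space for CPU time on every uncached request.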

