Search Results

Search found 46511 results on 1861 pages for 'mark of the web'.

Page 211/1861 | < Previous Page | 207 208 209 210 211 212 213 214 215 216 217 218  | Next Page >

  • Can resizing images with CSS be good?

    - by Echo
    After reading "Is CSS resizing of images still a bad idea?", I thought of a similar question (too similar? should this be closed?). Let's say you need 10 different product image sizes throughout your website and you have 20k-30k different product images. Should you generate 10 separate files for each image, or perhaps 5 files and use CSS to resize them into the other 5 sizes? Is there a combination that would work well, or should you always make separate image files? If you use CSS to resize them, you will save on storage (in GBs) but you will have a slight increase in bandwidth and slower-loading images (though if images are cached and you show both sizes of an image, would you use less bandwidth and have faster loads?). And of course you wouldn't want to use CSS to resize images for mobile sites.
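
    To make the tradeoff concrete, here is a minimal HTML/CSS sketch (file names and sizes are hypothetical): option A serves a file pre-rendered at the display size, while option B stores only one larger master file and lets CSS scale it down, so the browser downloads more bytes than it displays.

      <!-- Option A: a pre-generated 150px thumbnail; smallest download -->
      <img src="/img/product-1234-150.jpg" width="150" height="150" alt="Product 1234">

      <!-- Option B: one stored 600px master, downscaled by CSS;
           saves storage on the server but still downloads the full 600px file -->
      <img src="/img/product-1234-600.jpg" style="width: 150px; height: 150px;" alt="Product 1234">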

    Read the article

  • Here we go again - quest for a web-hosted forum via JavaScript

    - by jim
    Hello all. Disclaimer: if this is the wrong location for this question, then please advise me accordingly. Background: I've been using Disqus and IntenseDebate as a 'comments' service for a variety of my sites to great effect, and love the fact that I get a lot of the Facebook/Twitter integration 'for free', as well as the SEO benefits. Request: to this end, does anyone out there know of similar services that can be used to pull entire forums/threaded discussions into the app in a similar fashion (i.e. via Ajax web services)? Google has been at a loss to turn anything up on this front, and I'm therefore wondering whether it's unlikely that such a 'service' exists. Hope this strikes a chord out there... BTW - although I'm using this in ASP.NET MVC, I'm aware that this technology could be used on any platform capable of consuming JavaScript via Ajax, thus the wide spread of 'tags'.

    Read the article

  • DotNetNuke Development - The Right Tool For Web Development Today

    With the emergence of websites as one of the primary modes of communication on the Internet, many tools have been developed to assist in creating sites that are capable of meeting the highest expectations of their visitors. This article discusses DotNetNuke development for building sites for the new generation of visitors on the Internet.

    Read the article

  • CRM + Invoicing/Billing + Ticketing for a small web design company

    - by Mike
    Hi everyone, I am currently using activeCollab but it lacks the typical CRM features. I can't even keep notes about a customer saved in one place. What I am looking for is a simple but efficient CRM application that allows me to store all the (potential) customers along with notes on their phone calls, contracts, and agreements. On the billing end, I should be able to keep track of invoices and payments, along with a bit of sales reporting. A great extra would be a support ticket feature, but that's not really necessary. I looked at vTiger and SugarCRM at first, though they look too complex on the sales/campaigns end and completely lack the billing side. Do you have some good apps/services to suggest? :) Any programming language or OS would do. Both paid and free. Thanks, Mike

    Read the article

  • JavaScript: Modifying a parent element from a child blocks the web site from displaying

    - by Suresh Behera
    Well, recently I was working with DotNetNuke and we were using lots of JavaScript around this project. Internally, DotNetNuke uses a lot of ASP.NET user controls, which leads to situations where a child element accesses or modifies data of its parent. Here is one example: the DIV element is a child container element. The SCRIPT block inside the DIV element tries to modify the BODY element. The BODY element is the unclosed parent container of the DIV element. 1: < html > 2: < body >...(read more)
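
    A minimal sketch of the pattern being described (hypothetical markup, not the original sample): the SCRIPT inside the DIV modifies BODY while BODY is still an unclosed parent, which older versions of Internet Explorer abort on with an "Operation aborted" error.

      <html>
      <body>
        <div>
          <script type="text/javascript">
            // BODY is still an unclosed parent at this point; modifying
            // it here can abort page rendering in older versions of IE.
            document.body.appendChild(document.createElement('span'));
          </script>
        </div>
      </body>
      </html>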

    Read the article

  • What's the best approach to Facebook integration?

    - by Jay Stevens
    I have a new site/app going live next week (or somewhere close). I know there will be a relatively small (15,000?) but very dedicated group of people on Facebook who will be very likely to be interested in the site, so I know I need Facebook integration of some kind. I won't be doing Facebook logins or pulling/posting to profiles yet, but I plan to... The question: do I just do a Facebook "Page" for now? This is faster/easier to set up, seems a little less buggy, and I could migrate to a Facebook App later. Or do I create a "Facebook App" (with the API key/ID/secret, etc.) now, even if I'm doing nothing but using the "Like" button? That way I have no migration later, and I can use the JavaScript API to log "Like" button clicks to Google Analytics, etc. Thoughts? Experiences? Is there a migration process to move your old Page users to your new "App"? What are the advantages/disadvantages of each?
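
    For the second option, a sketch of the kind of logging being described, assuming the era's Facebook JavaScript SDK and the asynchronous Google Analytics snippet are already loaded on the page (the event category and action names are made up):

      // Subscribe to the Like event fired by the Facebook JS SDK
      // and forward it to Google Analytics as a custom event.
      FB.Event.subscribe('edge.create', function (href) {
        _gaq.push(['_trackEvent', 'Facebook', 'Like', href]);
      });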

    Read the article

  • WordPress sites are slow on shared hosting but plain HTML/CSS sites are fast

    - by sam
    I've got a shared hosting account: unlimited sites, unlimited GB, unlimited bandwidth, etc. Of course, because it's shared (and a cheap one at that) there are too many sites on each server, and it all runs slowly due to lack of RAM. What I've found is that my plain HTML/CSS/JS sites run an awful lot faster than my WordPress sites on this hosting, and I was trying to work out why. I'm not exactly sure how a browser sends a request for a page, or the full process of request and delivery, but are my HTML sites running faster because they just serve static files to the browser, whereas the WordPress sites have to query the database and build each page before it's delivered? Is that correct, or am I completely off course?

    Read the article

  • Is the term "web portal" obsolete?

    - by John Hamelink
    Firstly, sorry if this is the wrong place: I've looked at all the programming-related boards and this one seems like the best fit - correct me if I'm wrong. My boss uses the term "portal" for the project I work on all the time. The word makes me think of Yahoo in the late 90s. Does "portal" have old-school connotations, or is it just me? Do you think it's OK to use it without dragging our client's perception of the product down into the Middle Ages? Or, again, am I just being weird?

    Read the article

  • Using ASP.NET C# and JavaScript

    - by ctck
    I'm looking for the most efficient / standardized way of passing data between client-side JavaScript code and C# code-behind in an ASP.NET application. Currently I've been using the following methods to achieve this, but they all feel a bit like a fudge. The way I pass data from JavaScript to the C# code-behind is by setting hidden ASP variables and triggering a postback: <asp:HiddenField ID="RandomList" runat="server" /> function SetDataField(data) { document.getElementById('<%=RandomList.ClientID%>').value = data; } Then in the C# code I collect the list: protected void GetData(object sender, EventArgs e) { var _list = RandomList.Value; } Going back the other way, I often use either ScriptManager to register a function and pass it data during Page_Load: ScriptManager.RegisterStartupScript(this, this.GetType(), "Set", "Test();", true); or I add attributes to controls before a postback or during the initialization / pre-rendering stages: Btn.Attributes.Add("onclick", "DisplayMessage('Hello');"); These methods have served me well and do the job. However, they just don't feel complete. Is there a more standardized way of passing data between client-side markup / JavaScript and back-end code? I've seen some posts like this one: Injecting JavaScript : StackOverflow, that describe the HtmlElement class. Is this something I should look into? Thanks everyone for your time.
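
    One commonly used alternative (a sketch, not necessarily "the" standard answer; the method and parameter names are illustrative): expose a static page method and call it from client script, so no hidden field or full postback is involved.

      // Code-behind: a static method exposed to client script.
      [System.Web.Services.WebMethod]
      public static string GetGreeting(string name)
      {
          return "Hello, " + name;
      }

    On the client, with <asp:ScriptManager EnablePageMethods="true" runat="server" /> on the page, the generated proxy can be called like this:

      // Client side: call the page method and handle the result.
      PageMethods.GetGreeting('world', function (result) {
          alert(result);
      });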

    Read the article

  • JSF 2.2 recent progress - Early Draft

    - by alexismp
    JSF specification lead Ed Burns has an update on the progress of JSF 2.2, another component which should be required as part of the upcoming Java EE 7 standard. This includes a reminder of the scope of this specification, the availability of the early draft, and eight specific features that are being worked on, split into "Mostly Specified Features" and "Not Yet Fully Specified Features" (I think you can read the latter as "at risk"). My favorite is "763-EverythingIsInjectable". Remember that JSF 2.2 is due out in the middle of 2012, which is in time to be integrated into the Java EE 7 platform JSR (currently scheduled for the second half of 2012). In the meantime, JSF 2.2 nightly builds are available.

    Read the article

  • Web development - strange happenings

    - by Jason
    As I'm teaching myself PHP and MySQL during break, I'm experimenting with coding in an Ubuntu virtual machine where Apache, MySQL and PHP have been installed and configured against a shared folder. I'm not a big fan of KompoZer because the source code layout is a PIA, so I've started checking out gPHPEdit. However, since using it, I've come across two issues. (1) When I edit the .html and .php files, sometimes the file extension will change to .html~ and .php~, becoming invisible to the browser. The only solution is to switch to Windows, right-click and rename the file extension. (2) In Ubuntu Firefox, when I click my project's Submit button in a practice form, a dialog box pops up asking what Firefox should do with the .php file, rather than simply displaying it in the browser. When I do this in Windows Chrome and Firefox, it goes right to the response page. I'm not sure if this behavior is limited to gPHPEdit/KompoZer, but I've never noticed it happening in Dreamweaver. Any solutions? EDIT: The behavior in point 1 occurs both when Dreamweaver is open in Windows accessing the same files and when it is not. I changed the extension of welcome.php, added a comment in gPHPEdit, and the file changed to welcome.php~ upon saving.

    Read the article

  • Why deny msnbot/bingbot access to a website?

    - by Quandary
    I've seen quite a lot of tutorials that recommend banning user agents containing the strings libwww-perl and msnbot. I understand why one would ban libwww-perl; it's mainly, if not only, used for hacking and spamming. But why are there so many sites recommending banning msnbot/bingbot? Since it's a search engine, even one with only a marginal market share, I would expect one would want this bot to crawl one's sites. What is it that msnbot does that makes people ban it?
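
    For context, the kind of rule those tutorials typically suggest looks something like this Apache mod_rewrite fragment (a sketch of the pattern under discussion, not a recommendation):

      RewriteEngine On
      # Return 403 Forbidden to any client whose User-Agent
      # contains either string (case-insensitive).
      RewriteCond %{HTTP_USER_AGENT} (libwww-perl|msnbot) [NC]
      RewriteRule .* - [F,L]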

    Read the article

  • Dynamic URLs and links on one web page

    - by John
    I am trying to figure out how to create dynamic links and URLs on a static web page. What I want to do is the following: I have a single web page, for example MYWEBPAGEdotCOM/INDEX.HTML, that will always look the same except for one link on the page. That link would be, for example: LINK TO AFFILIATE: affiliatedotCOM/my-affiliate_code_here_DYNAMIC_REFERER. The only thing that would change is the DYNAMIC_REFERER, driven by whichever dynamic URL reached the page: MYWEBPAGEdotCOM/INDEX.PHP?id=test1, MYWEBPAGEdotCOM/INDEX.PHP?id=test2, MYWEBPAGEdotCOM/INDEX.PHP?id=test3, MYWEBPAGEdotCOM/INDEX.PHP?id=test4, which would change the dynamic link on the page to: affiliatedotCOM/my-affiliate_code_here_test1, affiliatedotCOM/my-affiliate_code_here_test2, affiliatedotCOM/my-affiliate_code_here_test3, affiliatedotCOM/my-affiliate_code_here_test4. Can someone tell me how I could go about doing this? I just don't want to have to make 100s of pages, and this would prevent me from having to do so.
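
    A minimal PHP sketch of the idea (hypothetical file, parameter, and domain names): read the id from the query string and build the affiliate link from it, so a single index.php serves every referrer.

      <?php
      // Requested as index.php?id=test1, index.php?id=test2, ...
      // Fall back to a default when no id is supplied, and escape
      // the value before echoing it into the link.
      $id   = isset($_GET['id']) ? $_GET['id'] : 'default';
      $link = 'http://affiliate.example.com/my-affiliate_code_here_' . urlencode($id);
      ?>
      <a href="<?php echo htmlspecialchars($link); ?>">LINK TO AFFILIATE</a>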

    Read the article

  • Software Design for Product Verticals and Service Verticals

    - by Rachel
    In every industry there are two verticals, a Product vertical and a Service vertical, so my question is: how does the design approach change when designing software for a Product vertical as compared to a Service vertical? What are the pros and cons of each case? Also, in the case of a Product vertical, how do you go about designing a product or feature, and what steps are involved? Lastly, I was reading the "How Facebook Ships Code" article, and it appears that product managers there have very little influence on how a product is developed; responsibility lies mainly with the developer of the feature. Is this good practice, and why would one go for this approach? What would be your comment on this kind of approach?

    Read the article

  • What bots are really worth letting onto a site?

    - by blunders
    Having written a number of bots, and seen the massive number of random bots that crawl sites, I am wondering: if the point of allowing bots onto a site is that a bot may send real traffic back, is there any reason to allow bots that are not known to send real traffic back? And how do you spot these "good" bots, based on how they identify themselves, the IPs they come from, their behavior, etc.?
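
    One common way to check that a "good" bot is who it claims to be is the double reverse-DNS lookup (a PHP sketch; Googlebot is used as the example because its hostname convention is published): resolve the visiting IP to a hostname, verify the domain, then resolve the hostname back and confirm it round-trips to the same IP.

      <?php
      // Double reverse-DNS check for a claimed Googlebot visit.
      function is_real_googlebot($ip) {
          $host = gethostbyaddr($ip);  // e.g. crawl-66-249-66-1.googlebot.com
          if (!preg_match('/\.(googlebot|google)\.com$/', $host)) {
              return false;            // hostname is not in Google's domains
          }
          return gethostbyname($host) === $ip;  // forward lookup must match
      }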

    Read the article

  • Java or Python for an internet application?

    - by jpartogi
    In choosing a technology for internet applications where the number of users may scale over time, which one should we consider: Java or Python? What are the considerations in choosing one and not the other? If speed and scalability are our main criteria, which one should we use? We have looked around, and it seems that there are more websites that use Python (e.g. Quora, Digg, Reddit, Bitbucket and Disqus) than Java. Based on that, can we say that Python is more suitable for internet applications where speed and scalability are the main criteria? However, we have also browsed around and found some comments saying that Java is actually faster than Python. Thank you for your insights.

    Read the article

  • SSRS/SharePoint - Reports made in Report Builder not listed in SharePoint web part

    - by Greg_the_Ant
    I followed the steps here to integrate Reporting Services with SharePoint in native mode. I made a page in SharePoint with the Report Explorer web part, and everything is working. The issue is that when I create a report with the web-based Report Builder tool, it shows up on the Report Manager page but not in the Report Explorer web part on the SharePoint page. New reports I upload using Report Manager do show up. Does anyone have any ideas? I'm really stuck.

    Read the article

  • Is there a free web-embedded SSH solution?

    - by ereOn
    Hi, I'm working for a large company which has some strict network policies. I'd like to connect from work to my home Linux server (mainly because it allows me to monitor my home-automation installation, but that's off-topic), but of course any SSH connection (TCP port 22) to an external site is blocked. While I understand why this is done (to avoid SSH tunnels, I guess), I really need some access to my box. (Well, "need" might be exaggerated, but it would be nice.) Do you know of any web-based solution that I could install on my home Linux server that would give me a pseudo-terminal (served over HTTPS) embedded in a web page? I'm not necessarily looking for something graphical: a simple web-embedded SSH console would do the trick. Or do you see any other solution that wouldn't compromise network security? Thank you very much for your solutions/advice.

    Read the article

  • Implement service layer in MVC

    - by Dan H
    We have a defined service layer hosted in WCF. We are now building a website that will need to use the service's functionality. The website is being written in ASP.NET MVC 4, and I'm trying to decide how to reference the WCF service from the MVC app. It's a large, complex website and it will be changing on a weekly basis. My first reaction is to abstract away the service references (about 7 services on this one WCF host) and create a service facade library with which the website interacts. But I don't know exactly how to use the service facade in MVC. I'm starting to think the models will be responsible for it: when the controller gets a model, that model should call the service (if needed) and return what the controller asked for. I'm trying to avoid having the MVC app know the details of the service references. So I could have a model factory that creates whatever model the controllers need, using the service facade to accomplish it. Is this a good plan, or am I off track?
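
    A minimal sketch of the facade idea being described (all interface, service, and model names are hypothetical, and a dependency-injection setup is assumed for the controller constructor): the controller depends only on an abstraction, and the facade hides the generated WCF proxy behind it.

      // Facade abstraction the MVC app codes against.
      public interface IOrderFacade
      {
          OrderModel GetOrder(int id);
      }

      public class OrderModel
      {
          public int Id { get; set; }
          public decimal Total { get; set; }
      }

      public class OrderFacade : IOrderFacade
      {
          public OrderModel GetOrder(int id)
          {
              // WcfOrderServiceClient stands in for the generated WCF proxy.
              using (var client = new WcfOrderServiceClient())
              {
                  var dto = client.GetOrder(id);
                  // Map the service DTO to a view-friendly model.
                  return new OrderModel { Id = dto.Id, Total = dto.Total };
              }
          }
      }

      // The controller never sees the WCF details.
      public class OrdersController : Controller
      {
          private readonly IOrderFacade _orders;
          public OrdersController(IOrderFacade orders) { _orders = orders; }

          public ActionResult Details(int id)
          {
              return View(_orders.GetOrder(id));
          }
      }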

    Read the article

  • Robots.txt practices with .htaccess redirections (inherits)

    - by Jayhal
    I have a question regarding how to write robots.txt files for many domains and subdomains with redirects in place. We have a hosting account with a primary domain and add-on domains. All of our domains and subdomains, including the primary domain, are redirected via .htaccess 301s to their own subdirectories in the primary domain's root directory. I'm confused about how I would write the robots.txt for certain directories. First, I wanted to confirm that, for domains and subdomains, crawlers look for the crawling rules (robots.txt) in the directory that acts as that URL's root. Also, that a directory is not affected by a robots.txt in its parent directory if the directory has its own domain/subdomain and that URL is the one crawlers access. (I'm pretty sure of this, but I wanted to confirm I didn't have a fundamentally flawed understanding of robots.txt.) In the original root directory on the account (where the primary domain pointed before the .htaccess was put in place), what should the robots.txt contain? When crawlers crawl our primary domain, will they look to the original root directory for the robots.txt, or will they reference the file in the new subdirectory where all the primary domain's site files are located? If the former, what should the root's robots.txt include, if anything at all? Would I be right to include a simple 'Disallow: /' for all agents, and then include more specific robots.txt files in each subdirectory with more specific instructions? Would that affect the crawling of the directory where the primary domain is now redirected? Any help is greatly appreciated. Thanks!
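
    For reference, a sketch of the two pieces being discussed (domain and directory names are hypothetical). Crawlers request robots.txt at the root of each hostname, so after a redirect like the one below, the file that answers for the add-on domain is whatever that host ends up serving from its subdirectory.

      # .htaccess in the account root (sketch): map the add-on
      # domain onto its own subdirectory, guarding against loops.
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?addon-example\.com$ [NC]
      RewriteCond %{REQUEST_URI} !^/addon-example/
      RewriteRule ^(.*)$ /addon-example/$1 [R=301,L]

    Each subdirectory can then carry its own rules, e.g. in /addon-example/robots.txt:

      User-agent: *
      Disallow: /private/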

    Read the article

  • Web CRM Software for Sales People

    Customer Relationship Management software, or CRM, is an application specifically designed to help a company manage complete data related to its customers, from advertising to initial lead, sale a... [Author: James Wong - Computers and Internet - April 29, 2010]

    Read the article
