Search Results

Search found 42917 results on 1717 pages for 'web spider'.


  • Would it be possible to build a client portal on Squarespace 6?

    - by aBathologist
    I'm helping a family member set up a site which will need to include a secure client portal, providing access to documents and a simple database. I have been encouraging them to go with a more established, open-source CMS like Drupal or Joomla, whose capability in this area is evident. However, they have a strong preference for Squarespace. Does anyone know if it would be possible to accomplish this with the new developer platform for Squarespace 6? I've spent well over an hour searching Google, the Squarespace site and Stack Exchange, but can't seem to find any clear answer to this question. I'm grateful for any insight you all can provide.

    Read the article

  • Hosting a website from a dynamic IP

    - by nick
    I recently upgraded my internet to the point that it is much faster and more reliable than my current webhost. I would like to move my current domain to be hosted at home, but my IP address is dynamic. As far as I know, I only get a new IP when I restart my modem and/or router (which is almost never) or when Cable One (my ISP) pushes out a firmware update (rarely). There are a few ways I can see doing this: 1) convince my ISP to give me a static IP; 2) assign my router my current IP to force a static IP (which might work?); 3) set my DNS record to my current IP address and update it on the rare occasions that it changes. Obviously I'm hoping that the first one works, but I don't want to pay a lot of extra money (if that's what it takes) to get a static IP address. Has anyone had any luck with something like that?
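
    A minimal sketch of option 3, assuming the DNS provider exposes a dynamic-DNS update endpoint; the update URL, token and hostname below are placeholders, and the public-IP check uses the plain-text api.ipify.org service:

        # Poll the public IP and push a DNS update whenever it changes.
        # CHECK_URL is a real "what is my IP" service; UPDATE_URL and TOKEN
        # are hypothetical stand-ins for your provider's API.
        import time
        import urllib.request

        CHECK_URL = "https://api.ipify.org"
        UPDATE_URL = "https://dns.example/update"   # placeholder endpoint
        TOKEN = "your-api-token"                    # placeholder credential
        HOSTNAME = "www.your-domain.example"

        last_ip = None
        while True:
            ip = urllib.request.urlopen(CHECK_URL, timeout=10).read().decode().strip()
            if ip != last_ip:
                req = urllib.request.Request(
                    f"{UPDATE_URL}?hostname={HOSTNAME}&ip={ip}",
                    headers={"Authorization": f"Bearer {TOKEN}"},
                )
                urllib.request.urlopen(req, timeout=10)
                last_ip = ip
            time.sleep(300)  # re-check every five minutes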

    Read the article

  • What is the best way to manage all images in a big project: inline images, background images, CSS sprite images?

    - by metal-gear-solid
    How do you manage all the images in a big project: inline images, background images, CSS sprite images? Do you follow any naming convention? Do you create sub-folders to organize images? In a big project, how do you make it easy for new people on the development team to find out whether an image they want to use (because it appears in a new PSD they received from the designer) is already available in the project's images folder, and how can they find it easily?
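
    One low-tech aid, sketched below under the assumption that all assets live under a single images directory: index every file by content hash so an exported PSD asset can be checked against what is already in the repository (the paths and extensions are illustrative):

        # Index every image under the project's assets folder by content hash,
        # so a duplicate export can be detected before it is added again.
        import hashlib
        from pathlib import Path

        IMAGE_ROOT = Path("assets/images")                     # assumed layout
        EXTENSIONS = {".png", ".jpg", ".jpeg", ".gif", ".svg"}

        def build_index(root):
            """Map sha256 digest -> list of paths with that exact content."""
            index = {}
            for path in root.rglob("*"):
                if path.is_file() and path.suffix.lower() in EXTENSIONS:
                    digest = hashlib.sha256(path.read_bytes()).hexdigest()
                    index.setdefault(digest, []).append(path)
            return index

        def find_existing(candidate, index):
            digest = hashlib.sha256(Path(candidate).read_bytes()).hexdigest()
            return index.get(digest, [])

        index = build_index(IMAGE_ROOT)
        print(find_existing("new-export/icon-cart.png", index))  # illustrative path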

    Read the article

  • Beginner's Walk - Web Development

    This Table of Contents is editable by all Silver members and above. What we want you to do is replace the entries in the Table of Contents below with links to articles that represent the entries.

    Read the article

  • What should I be aware of when preparing documentation of a website for later maintenance?

    - by user782104
    The development team has finished a website and my duty is to prepare a document so that other programmers can maintain the website with ease. As I am inexperienced in this, I would like to ask what should be mentioned (document structure) in that report? So far my only idea is to prepare an ERD diagram for the database and a flow chart for each function. Any other suggestions, e.g. what cookies are stored? Thanks

    Read the article

  • Is my webhost infected? [on hold]

    - by Svein Erik
    I have 2 websites, and both websites randomly get redirected automatically to porn sites. I can't figure out whether it's something in the code or the webhost is infected somehow. The two sites are: http://www.storkas.com/vm and http://www.prowebdesign.no/vm. Where should I start to find out what's happening? I scanned the sites on hxxp://sitecheck.sucuri.net. One odd thing I saw there is a reference to hxxp://js.nohealth.org/js/jquery-1.1.js. I do not have this in my HTML code.
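
    One place to start, sketched below under an assumed layout: walk the account's web root and flag any file that references the injected hostname reported by the scanner (the docroot path is a placeholder):

        # Flag files under the web root that contain the injected script's
        # hostname. WEB_ROOT is a placeholder; NEEDLE comes from the scan above.
        from pathlib import Path

        WEB_ROOT = Path("/var/www/html")
        NEEDLE = b"js.nohealth.org"
        SUFFIXES = {".php", ".html", ".htm", ".js"}

        for path in WEB_ROOT.rglob("*"):
            if path.is_file() and (path.suffix.lower() in SUFFIXES or path.name == ".htaccess"):
                try:
                    data = path.read_bytes()
                except OSError:
                    continue
                if NEEDLE in data:
                    print("possible injection:", path)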

    Read the article

  • Lost support for Web Access on Verizon BlackBerry World Edition

    - by Jimsmithkka
    Hello all, I believe that some silliness has occurred with my BlackBerry after an OS upgrade. I have a 2010 BlackBerry World Edition phone, purchased from a friend who went iPhone, that at first worked with the web on the Verizon network. When I connected it to my PC to transfer contacts, it prompted for an OS upgrade, which I performed. Post-upgrade I have found that I can no longer access any of the web services, e.g. App World, email, Twitter, the browser, and they all state that I need to upgrade my account to gain access. I had a Storm previous to this that worked fine, and at the Verizon store they told me this device is no longer supported (new in 2010, though), and they got me a free "upgrade" to the BlackBerry flip. What I could use help with is finding a source stating it is discontinued, or a guide that will help me re-enable the web features. I can provide further info later if needed (currently at work with the flip; the World Edition is at home).

    Read the article

  • Handle all authentication logic in database or code?

    - by Snuffleupagus
    We're starting a new(ish) project at work that has been handed off to me. A lot of the database-side stuff has been fleshed out, including some stored procedures. One of the stored procedures, for example, handles creation of a new user. All of the data is validated in the stored procedure (for example, the password must be at least 8 characters long, must contain numbers, etc.) and other things, such as hashing the password, are done in the database as well. Is it normal/right for everything to be handled in the stored procedure instead of the application itself? It's nice that any application can use the stored procedure and get the same validation, but the application should have a standard framework/API function that solves the same problem. I also feel like it takes the data away from the application and is going to be harder to maintain and add new features to.
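
    For contrast, a minimal sketch of the application-side alternative being weighed here: one reusable function that applies the same rules and hashes the password before anything reaches the database (the rules and iteration count are illustrative, not the project's actual policy):

        # Validate and hash a password in application code rather than in the
        # stored procedure. The rules mirror the ones described above.
        import hashlib
        import os
        import re

        def create_user_credentials(password):
            """Return (salt, hash) ready to store, or raise ValueError."""
            if len(password) < 8:
                raise ValueError("password must be at least 8 characters long")
            if not re.search(r"\d", password):
                raise ValueError("password must contain at least one digit")
            salt = os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
            return salt, digest

        salt, digest = create_user_credentials("s3cretpassw0rd")
        print(len(salt), len(digest))  # 16 32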

    Read the article

  • How to Use the Google Keyword Tool to Pimp Your Web Page

    For all the search engines, and maybe especially for Google, topical relevance is everything. The words on your webpage will get ranked for relevance on one or more topics or categories. There are probably 100 or more other factors that go into a Google ranking, most of which we will never know, but the major ranking factor has to be the words on your page and their proximity to each other.

    Read the article

  • My Sites Were Hacked. What To Do?

    - by Vad
    I host multiple domains with this very popular hosting provider, and I just went to one of my sites and... I see a black page with the message "Hacked by...". I checked, and all my sites with the provider are showing this same page. Inside the file system I have seen that the hacker placed default.* and index.* files with this message. So the hacker overwrote all the index pages and placed new pages, and that is under every, I say again, every folder. Cleaning this up will be close to a most horrible job. What should I do (right now I am awaiting the restore of files from the hosting provider)? How do I prevent this? Whom should I blame?
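
    A small sketch of one way to size up the clean-up, assuming shell or script access to the account (the docroot path is a placeholder): list every index.* and default.* file with its modification time, so the files dropped during the defacement stand out by timestamp:

        # List index.* / default.* files under the docroot, newest first.
        import time
        from pathlib import Path

        WEB_ROOT = Path("/home/account/public_html")   # placeholder docroot

        hits = []
        for pattern in ("index.*", "default.*"):
            hits.extend(WEB_ROOT.rglob(pattern))

        for path in sorted(hits, key=lambda p: p.stat().st_mtime, reverse=True):
            stamp = time.strftime("%Y-%m-%d %H:%M", time.localtime(path.stat().st_mtime))
            print(stamp, path)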

    Read the article

  • Best way to move your domain and keep the Google position

    - by netadictos
    I have to move one domain to a new one which is semantically better for SEO. I would like to know the best way to do it so that the new domain keeps the Google position. I know the basic steps: put a 301 redirect on the old domain (with an Apache rule it can be very detailed, but the important thing for Google is the 301 header); tell Google about the move through the Webmaster Tools page; and try to gain PageRank for the new domain.
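
    As a minimal sketch of the first step, a site-wide 301 in the old domain's .htaccess might look like the following; the domain name is a placeholder and mod_rewrite is assumed to be enabled:

        # .htaccess on the old domain: permanently redirect every request to
        # the same path on the new domain.
        RewriteEngine On
        RewriteRule ^(.*)$ http://www.new-domain.example/$1 [R=301,L]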

    Read the article

  • Older PHP vs. newer PHP version [closed]

    - by Monty
    My company is building a website with a database. The programmers used PHP 5.0. My (shared) service provider in the meantime upgraded to PHP 5.3.0. Fixes have been ongoing and seem endless... Do I move to a VPS and install the older PHP, or should we rebuild with the newer PHP? When working remotely with programmers, what is the protocol regarding delivery of all the code? What is the industry standard? I need an independent party to review their work. How should this be approached?

    Read the article

  • Web Development Company - Helping Online Business to Flourish

    In the present times, the internet provides a tremendous advantage to online business owners. Its vast global reach has made the internet a popular marketing medium, and it offers a cheap and reliable platform on which to launch an online venture. From the comfort of the office or home, people can easily manage their online business, irrespective of their location or time zone.

    Read the article

  • Run a server-side script

    - by ooo
    I'm in the process of deploying my first website, which is written in ASP.NET. I need to run a server-side script at set intervals during the day which updates a database, even if there is nobody using the site. I was led to believe that using the Windows Task Scheduler would be the best option, but now that I've joined a hosting company the setup is not really what I was expecting. It's shared hosting with basic FTP and no apparent built-in task scheduler. The hosting company's support is not very good and hasn't been able to advise how I could do this, so I hoped to get help here on my options before I consider changing company. [The hosting company starts with 1 and ends with 1 :)]
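
    One common workaround on shared hosting without a scheduler, sketched below as an assumption about the setup: expose the update logic behind a protected URL in the site, then have any machine you control (or an external ping/cron service) request it on a schedule. The URL and secret are placeholders.

        # Request a protected "run the update" URL on a fixed schedule from a
        # machine you control. The endpoint and key below are placeholders.
        import time
        import urllib.request

        UPDATE_URL = "https://www.your-site.example/tasks/update?key=long-random-secret"
        INTERVAL_SECONDS = 60 * 60  # once an hour

        while True:
            try:
                with urllib.request.urlopen(UPDATE_URL, timeout=30) as resp:
                    print("update returned", resp.status)
            except OSError as exc:
                print("update failed:", exc)
            time.sleep(INTERVAL_SECONDS)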

    Read the article

  • Implementing a dynamic query handler on historical data

    - by user2390183
    EDIT: refined the question to focus on the core issue. Context: I have historical data about property (house) sales collected from various sources in a centralized/cloud data source (assume information collection is handled by a third party), and I am planning to develop an application to query and retrieve data from this centralized source.

    Example queries. Simple: for a given XYZ postcode, what is the average house price for a 3-bedroom house? Complex: what is the estimated price for a house at "DD, Some Street, XYZ Postcode" (worked out from average values of historic data filtered by various characteristics of the house: postcode, number of bedrooms, total area, and deeper attributes like building type, year built, features)? In addition to the average price, the application should support other property metrics (maximum or minimum price, etc.) and a trend (graph) of a selected property attribute over a period of time. Hence, the queries should not enforce a search based on a primary key or a few fixed fields. In other words, queries can be: what is the change in 3-bedroom house prices (irrespective of location) over the last 30 days? What kinds of properties can we get for price X (irrespective of location or house type)?

    The challenge I have is identifying the domain (BI/data analytics, DB design, DB query interface, DW, or something else) this problem (dynamic queries on historic data) belongs to, so that I can explore further. My findings so far (I could be wrong on the following, so please correct me if you think so): I briefly read about BI/data analytics, and I think it is a heavyweight solution for my problem and has scalability issues. As for DB design, as I understand it an RDBMS works well if you know the data model at design time, but I expect the attributes of a property or other entities (users) to evolve quickly, so maintenance would be an issue, and with multiple users executing queries at the same time performance would be a bottleneck. Other options like graph DBs (http://www.tinkerpop.com/) seem a bit complex (they are good, but using such general-purpose tools feels like assembly programming to solve my problem). Big-data solutions are for analysing data from multiple unrelated domains. So, any suggestion on the space this problem fits in? (Especially if you have design/implementation experience of a back-end for property listings or similar portals.)
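
    A toy sketch of the "filters and aggregate as data" idea over a flat table of historic sales, using pandas; the column names and sample rows are assumptions made up for illustration:

        # Filters and the aggregated attribute are parameters, not hard-coded
        # fields, so new property attributes only add columns.
        import pandas as pd

        sales = pd.DataFrame({
            "postcode": ["XYZ 1", "XYZ 1", "ABC 2"],
            "bedrooms": [3, 3, 2],
            "price":    [250_000, 270_000, 180_000],
            "sold_on":  pd.to_datetime(["2013-06-01", "2013-06-20", "2013-06-15"]),
        })

        def query(df, filters, value_column, aggregate):
            """filters: {column: required value}; aggregate: 'mean', 'max', ..."""
            for column, required in filters.items():
                df = df[df[column] == required]
            return getattr(df[value_column], aggregate)()

        # "Average price of a 3-bedroom house in postcode XYZ 1"
        print(query(sales, {"postcode": "XYZ 1", "bedrooms": 3}, "price", "mean"))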

    Read the article

  • How can I superscript a single character throughout my site?

    - by dvn66
    I'm finishing up development work for a client, and they have a late request to superscript every occurrence of the &reg; character (®) throughout the site. I merely get by with CSS, and I'm not even sure CSS provides the best solution here. How can I make a global change to every occurrence of a single character in the site copy without going back and adding tags everywhere? If the best solution does involve inserting a span or other tag, I'm thinking it could be accomplished with a database query (all the copy is wired up to a CMS).
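
    A minimal sketch of the transform such a pass over the CMS copy could apply, wrapping each ® (literal character or &reg; entity) in a sup tag while skipping occurrences that are already wrapped; the sample string is illustrative:

        # Wrap bare ® / &reg; in <sup> without double-wrapping existing ones.
        import re

        PATTERN = re.compile(r"(?<!<sup>)(®|&reg;)(?!</sup>)")

        def superscript_reg(html):
            return PATTERN.sub(r"<sup>\1</sup>", html)

        print(superscript_reg("Acme® Widgets and Acme&reg; Pro"))
        # Acme<sup>®</sup> Widgets and Acme<sup>&reg;</sup> Pro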

    Read the article

  • Abandonment to blame for the last JavaScript file not always being loaded?

    - by Larsenal
    I have a code snippet for an app that users load as a third-party script on their site. The general sequence is as follows: the site loads http://www.example.com/foo.js; foo.js does stuff; 1 to 2 seconds later, foo.js loads bar.js. Now, in a perfect world I'd want to see matching counts for the calls to foo.js and bar.js. However, bar.js loads only about 94% of the time. I'm wondering how much of this discrepancy might be attributable to site abandonment, given that bar.js is delayed by 1 or 2 seconds. I posted here instead of Stack Overflow since I think it's more a question about the typical time on page when users abandon a page.

    Read the article

  • Is Moving Entity Framework objects over a webservice really the best way?

    - by aceinthehole
    I've inherited a .NET project that has close to two thousand clients out in the field that need to push data periodically up to a central repository. The clients wake up and attempt to push the data up via a series of WCF web services, passing each Entity Framework entity as a parameter. Once the service receives this object, it performs some business logic on the data and then sticks it in its own database, which mirrors the database on the client machines. The trick is that this data is being transmitted over a metered connection, which is very expensive, so optimizing the data is a serious priority. Now, we are using a custom encoder that compresses the data (and decompresses it on the other end) while it is being transmitted, and this is reducing the data footprint. However, the amount of data that the clients are using seems ridiculously large given the amount of information that is actually being transmitted. It seems to me that Entity Framework itself may be to blame: I suspect that the objects are very large when serialized to be sent over the wire, with a lot of context information and who knows what else, when what we really need is just the 'new' inserts. Is using Entity Framework and WCF services as we have done so far the correct way, architecturally, of approaching this n-tiered, asynchronous, push-only problem? Or is there a different approach that could optimize the data use?

    Read the article

  • Vanilla forum personal journal tool

    - by user16648
    I am developing a forum for a research project; I am using the Vanilla forum (though I am not tied to this yet). Another feature of this site will be a personal journal/diary/blog. Only the user and administrators will have access to the journal. The journal does not need any advanced features. Does anybody have any suggestions on software or scripts that could easily be integrated with the Vanilla forum for this purpose? My first thought was using WordPress, but it is a bit too complex to administer, as the site is meant to be simple, and I don't really see how it can be integrated easily the way I want it.

    Read the article

  • Hosting a magnet link site which could possibly infringe on copyrighted material?

    - by Griff
    For the last 3 months I have built a crawler, an indexer and a lot of other things for what started out as a home project for indexing magnet links on the internet. As my project grew, I have thought about releasing my collected data (which at the minute is on a public domain but with no access) to the public. Whatever the crawler sucks in goes in, and whatever the indexer decides to index gets indexed, as it is a fully automated process. My question is as follows: considering that most of the data collected by what I have built points to illegal, copyrighted material (as most magnet links do), where would it be best to host such a site? I notice all of the already-public torrent sites are hosted in India; is this because their laws are less strict on copyright infringement? Have any of you hosted such a site, and if so, what problems have you run into? And, as always, any advice on being a webmaster for this type of website?

    Read the article
