Search Results

Search found 43006 results on 1721 pages for 'web scraping'.

Page 6 of 1721

  • Load Balancer impact on web development

    - by confusedGeek
    This question has its roots in a SharePoint site that I am helping with. Background on the issue I dealt with: the dev box and the integration server are not set up behind a load balancer. The links were being built using the HttpRequest.Url value from the current context; note that the links weren't relative links but full URIs. Once we deployed to testing (which has an LB, amongst other things) we received errors on the links being built, since the server had an address of "http://some.site.org:999" while the address at the LB was "https://site.org" (SSL was off-loaded at the LB). The fix was easy enough: use relative URIs. The question: since this is the first site I've worked with that sits behind a load balancer, I'm wondering if there are other gotchas that I need to consider when developing a site behind one?
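    A minimal sketch of the fix described above, with a hypothetical page name: build links from application-relative paths rather than from the absolute URL the server sees, so the scheme and host presented by the load balancer are preserved.

        // using System.Web;

        // The broken pattern: a full URI built from the server's own view,
        // e.g. "http://some.site.org:999/reports.aspx".
        Uri serverView = HttpContext.Current.Request.Url;
        string badLink = new Uri(serverView, "/reports.aspx").ToString();

        // The fix: emit a relative URI and let the browser resolve it against
        // whatever scheme/host the load balancer actually presented.
        string goodLink = VirtualPathUtility.ToAbsolute("~/reports.aspx");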

    Read the article

  • Thoughts on web development architecture through integrating C++ in the future to a web application

    - by Holland
    I'm looking to build a website (it's actually going to be a commercial startup). I saw this question and it really shed some light on a few things that I was hoping to understand (kudos to the OP). After seeing that, it would make sense that, unless the website were actually required to handle millions of hits per day, writing a C++ backend on the server side wouldn't be a viable solution. But this got me thinking: what if, in the (unlikely) event, it does go that route in the future? The problem is that, while I'm thinking of starting all this using .NET (in the beginning) just to get something quick and easy up without a lot of hassle (in terms of learning), and then moving towards something more open source (such as Python/Django or RoR) later to save money and to support OSS, I'm wondering whether, if the website actually becomes big, it would be a good idea to integrate a C++ backend, use Python on top of C++ for a strong foundation, and then layer HTML/CSS/AJAX/etc. on top of the backend's foundation. I guess what I'd like to know is: if this were to happen, would it be a proper approach in terms of architecture? I'd definitely be supporting MVC, as that seems to be a great way to implement a website. All in all, would one consider this rational, or are there other alternatives? I like .NET, and I'd like to use it in the beginning, because I have much more experience with it than with, say, Python or PHP, and I prefer it in general, but I really do want to support OSS in the future. I suppose the question I'm looking to answer is: "is this pragmatic?"

    Read the article

  • Best practices for web page styling with CSS?

    - by adifire
    I have a website to design. I have information on how the page should look and interact. The problem is that I'm not good at front-end design, and I have put many, many hours into getting the hang of this stuff. Currently, I am taking the CSS from sample sites on GitHub and using it to style my site, which doesn't seem like an ethical way to work. Question: how do you style web pages? Are there some really good tools? A detailed answer would be deeply appreciated, or a link to a wiki would work as well.

    Read the article

  • Technology Roadmap: Portlet JSR 286 vs Widget/Gadget

    - by Aerosteak
    Hello. IBM has got me confused (again). For many years IBM has been pushing Portlet Containers with the JSR 168 and later the JSR 286 specification. In 2008-2009, IBM's Lotus division introduced the iWidget specification. Based on my reading, it is a more dynamic and lightweight version of Portlets, close to Google Gadgets. It uses a different paradigm than Portlets while providing the same features. A major differentiator with this kind of client-side technology is that you don't need a big and costly portal infrastructure. To avoid falling into 'it depends on your needs' discussions, let's consider the following: a new company, with no legacy portlets and no portal in place. What are your thoughts on this?

    Read the article

  • One page using querystring or many folders and pages?

    - by ClarkeyBoy
    I have an application where I keep the 'core' code in one folder, for which there is a virtual directory in the root, such that I can include any core files using /myApp/core/bla.asp. I then have two folders outside of this, each with a default.asp, which currently use the querystring to define what page should be displayed. One page is for general users; the other is only accessible to users who have permission to manage users/usergroups/permissions. The core code checks the querystring and then checks the permissions for that user. An example as it is now is default.asp?action=view&viewtype=list&objectid=server. I am not worried about SEO, as this is an internal app and uses Windows Auth. My question is: is it better the way it is now, or would it be better to have something like the following?

        /server/view/list/
        /server/view/?id=123
        /server/create/
        /server/edit/?id=123
        /server/remove/?id=123

    In the above folders I would have a home page which defines all the variables that are currently determined by the querystring; in /server/create/, for example, I would define the action as 'create', the object name as 'server' and so on (see the sketch below). In terms of future development, I really have no idea which method would be best. I think the second method would be best in terms of following which page does what, but this is such a huge change to make at this stage that I would really like some opinions, preferably based on experience. PS: sorry if the tags are wrong; I am new to this forum and thought this was a bit too much of a discussion for Stack Overflow, as that is very much right/wrong answer based. I got the idea that SE is more discussion based.
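    A minimal sketch (in C#, though the app itself is classic ASP) of how the second scheme's home pages could derive those variables from path segments instead of the querystring; the segment names are hypothetical.

        // For a request path like "/server/edit/?id=123":
        string path = "/server/edit/";
        string[] parts = path.Trim('/').Split('/');

        string objectName = parts.Length > 0 ? parts[0] : "";      // "server"
        string action     = parts.Length > 1 ? parts[1] : "view";  // "edit"
        string viewType   = parts.Length > 2 ? parts[2] : "";      // e.g. "list"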

    Read the article

  • Looking for Application Framework Features Lists, Comparisons and Guides [closed]

    - by Blah McBlah
    I am looking for lists of the things that application frameworks can do, and for websites that have matrices, marketing content, blog articles and whatnot comparing application frameworks to each other or just selling a framework. I'm talking generally, so regardless of coding language, operating system or client device. I want it all. I've found a few online, and would appreciate whatever sources I can glean from this site too.

    Read the article

  • Can you be a web and desktop developer at the same time?

    - by Charmop
    In my environment, I have found web programmers, desktop programmers, and programmers who do both web and desktop. As for myself, I started my career with desktop development using C and then Java, and did a couple of simple projects. Then, in my final year of university, my project was a web one, so I have been doing web development from that moment until now. But when I meet people who chose to be web or desktop developers from the beginning, I find that they have more knowledge/experience than I have. So I kind of regret it: why didn't I specialize from the first day? The question is: is it a good habit to work in two, more or less, different fields (web and desktop), or must we specialize?

    Read the article

  • What are the pre-requisites for writing .NET web services?

    - by wackytacky99
    I am very new to web development. I have been a C/C++ programmer for 5 years, and I'm starting to get into web development, writing web services, etc. I understand the basic concepts of web services. I know .NET web services can be written in VB or C#, and working with C/C++ will help in getting used to writing code in C#. I do not have experience with the .NET Framework. I'd like to quickly get into writing .NET web services and learn on the go, without spending a lot of time extensively learning the .NET Framework (if possible). Any suggestions? Update: I know my way around databases and SQL Express in Visual Studio.
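    A minimal sketch of an ASMX web service in C#, with hypothetical names throughout; ASP.NET generates the SOAP plumbing, so you can start writing services before learning the whole framework.

        using System.Web.Services;

        [WebService(Namespace = "http://example.org/")]
        public class EchoService : WebService
        {
            [WebMethod]
            public string Echo(string message)
            {
                return "You said: " + message;
            }
        }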

    Read the article

  • SEO for single-page content-less Web App

    - by brillout.com
    As written in the title, the website on which I am trying to do Search Engine Optimization has the following two properties: it doesn't have any content in the SEO sense (it doesn't hold any information and only offers functionality), and it consists of only one page/URL. Since most of the SEO tips/tricks I read are based on content, how do I perform SEO on such a website? For more info: the website is basically just a timer/alarm/stopwatch.

    Read the article

  • Change dynamic web reference from web.config/app.config

    - by Snæbjørn
    I have a problem changing a dynamic web reference in the config file. Changing the URL in the config file doesn't have any effect; I have to change the URL in .settings and recompile for it to change. I added the web reference using the wizard and set the URL behavior to dynamic, which added the relevant XML tags to the config file. In my solution I have the web API (web reference) in a separate project (class library), so I referenced the project and copied the <applicationSettings> over:

        <applicationSettings>
          <StartupProject.Properties.Settings>
            <setting name="WebReference" serializeAs="String">
              <value>http://someurl/somefile.asmx</value>
            </setting>
          </StartupProject.Properties.Settings>
        </applicationSettings>

    Note that it's <StartupProject.Properties.Settings> and not <WebRefProject.Properties.Settings>. Are there some limitations I'm not aware of, or am I doing something wrong?
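    One thing worth checking, assuming the generated Settings class lives in the class library: the runtime looks settings up under a section named after the assembly that defines them, so the section copied into the startup project's config may need to keep the library's name, along with a matching <section> declaration under <configSections>. A sketch of that shape (project names hypothetical):

        <applicationSettings>
          <WebRefProject.Properties.Settings>
            <setting name="WebReference" serializeAs="String">
              <value>http://someurl/somefile.asmx</value>
            </setting>
          </WebRefProject.Properties.Settings>
        </applicationSettings>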

    Read the article

  • WebApps tendency

    - by Narek
    There is a strong trend toward making web apps, and it even seems that very soon a lot of features will be available online, so that for everyday use people will have all the necessary software free online and will not need to install any software locally. Only specific (professional) tools that people don't usually use at home will not be available as web apps. So my question: how do you imagine selling software that is necessary for everyday use and is not free? It seems vendors can't make money any more by selling such products, since there is no need for them. And what disadvantages do web apps have; that is to say, what is bad about using software online compared with having the same software locally (please list)? Please do not consider this question unconnected to programming, as I would like to gather some statistics from professional programmers who are aware of today's trends in software and programming. Thanks.

    Read the article

  • Top techniques to avoid 'data scraping' from a website database

    - by Addsy
    I am setting up a site using PHP and MySQL that is essentially just a web front-end to an existing database. Understandably, my client is very keen to prevent anyone from being able to make a copy of the data in the database, yet at the same time wants everything publicly available, and even a "view all" link to display every record in the db. Whilst I have put everything in place to prevent attacks such as SQL injection, there is nothing to prevent anyone from viewing all the records as HTML and running some sort of script to parse this data back into another database. Even if I were to remove the "view all" link, someone could still, in theory, use an automated process to go through each record one by one and compile them into a new database, essentially pinching all the information. Does anyone have any good tactics for preventing, or even just deterring, this that they could share? Thanks
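    One deterrent, sketched here in C# although the site in question is PHP/MySQL (the idea is language-agnostic): throttle each client IP to a fixed number of record views per time window, which makes a full crawl of the database painfully slow without inconveniencing real visitors.

        using System;
        using System.Collections.Generic;

        class RateLimiter
        {
            private readonly Dictionary<string, Queue<DateTime>> hits =
                new Dictionary<string, Queue<DateTime>>();
            private readonly int limit;
            private readonly TimeSpan window;

            public RateLimiter(int limit, TimeSpan window)
            {
                this.limit = limit;
                this.window = window;
            }

            // Returns false once an IP exceeds the limit; serve a captcha or
            // an error page instead of the record in that case.
            public bool Allow(string ip)
            {
                Queue<DateTime> q;
                if (!hits.TryGetValue(ip, out q))
                    hits[ip] = q = new Queue<DateTime>();

                DateTime now = DateTime.UtcNow;
                while (q.Count > 0 && now - q.Peek() > window)
                    q.Dequeue();                 // drop hits outside the window

                if (q.Count >= limit)
                    return false;

                q.Enqueue(now);
                return true;
            }
        }

    Determined scrapers will rotate IPs, so this only raises the cost of copying; it deters casual scraping rather than preventing it outright.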

    Read the article

  • Screen scraping software that will traverse pages

    - by nilbus
    We're creating a mashup site that pulls information from many sources all over the web. Many of these sites don't provide RSS feeds or APIs to access the information they provide, which leaves us with screen scraping as our method for collecting the data. There are many scripting tools out there, written in different scripting languages, that require you to write your scraping scripts in the language the scraper was written in; Scrapy, scrAPI, and scrubyt are a few written in Ruby and Python. There are other web-based tools I've seen, like Dapper, that create XML or RSS feeds based on a web page. It has a beautiful web-based interface that requires no scripting skills to use. This would be a great tool if it were able to traverse multiple pages to gather data from hundreds of pages of results. We need something that will scrape information from paginated web sites, much like scrubyt, but with a user interface that a non-programmer could use. We'll script up our own solution if we need to, probably using scrubyt, but if there's a better solution out there, we want to use it. Does anything like this exist?
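    For scale, a minimal sketch of what "traversing pages" means when scripted, here with HtmlAgilityPack in C# (the URL and selectors are hypothetical): follow the "next" link until it disappears, collecting rows from each page. This assumes the next link carries an absolute href.

        using System;
        using HtmlAgilityPack;

        var web = new HtmlWeb();
        string url = "http://example.com/results?page=1";

        while (url != null)
        {
            HtmlDocument doc = web.Load(url);

            var rows = doc.DocumentNode.SelectNodes("//div[@class='result']");
            if (rows != null)
                foreach (HtmlNode row in rows)
                    Console.WriteLine(row.InnerText.Trim());

            // SelectSingleNode returns null when there is no next link,
            // which ends the traversal.
            HtmlNode next = doc.DocumentNode.SelectSingleNode("//a[@rel='next']");
            url = (next != null) ? next.GetAttributeValue("href", (string)null) : null;
        }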

    Read the article

  • Screen scraping over SSL with .NET

    - by Even Mien
    What solutions exist for screen scraping a site over SSL for use with .NET? My use case is that I need to log in to a partner website (https), navigate through a dynamic hierarchy, and download a zipped file of reports. I could certainly use other screen scrapers if there are no good viable options in .NET, either through the framework or OSS.
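    A minimal sketch of the flow in plain .NET, assuming a hypothetical forms-based login endpoint and field names; the CookieContainer is what carries the authenticated session across the subsequent HTTPS requests.

        using System.IO;
        using System.Net;
        using System.Text;

        var cookies = new CookieContainer();

        // 1. Log in over HTTPS.
        var login = (HttpWebRequest)WebRequest.Create("https://partner.example.com/login");
        login.Method = "POST";
        login.ContentType = "application/x-www-form-urlencoded";
        login.CookieContainer = cookies;
        byte[] body = Encoding.UTF8.GetBytes("user=me&pass=secret");
        login.ContentLength = body.Length;
        using (Stream s = login.GetRequestStream())
            s.Write(body, 0, body.Length);
        login.GetResponse().Close();

        // 2. Reuse the session cookies to fetch the report archive.
        var fetch = (HttpWebRequest)WebRequest.Create("https://partner.example.com/reports.zip");
        fetch.CookieContainer = cookies;
        using (WebResponse resp = fetch.GetResponse())
        using (FileStream file = File.Create("reports.zip"))
            resp.GetResponseStream().CopyTo(file);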

    Read the article

  • Latest evolutions in web content display

    - by srinannapa
    I would like to know about the latest techniques for how content is managed in web pages (I use Java technology). An RSS feed is probably one example of the kind of thing I'm expecting, and one I can implement in my web pages. I need some more of the latest technologies of this kind to implement in web pages. Srinivas

    Read the article

  • screen scraping

    - by sam
    Hello folks, I am screen scraping a website which is in the Danish language. I am unable to scrape certain characters, like those in "må". Any idea how to solve this? Thanks
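    The question doesn't name a platform, but the usual cause is decoding the response with the wrong charset. A minimal sketch in .NET, with a hypothetical URL:

        using System.Net;
        using System.Text;

        using (var client = new WebClient())
        {
            // Danish sites are often served as ISO-8859-1 (Latin-1) rather
            // than UTF-8; decoding with the wrong charset is what mangles
            // characters such as 'å'. Check the page's Content-Type header.
            client.Encoding = Encoding.GetEncoding("ISO-8859-1");
            string html = client.DownloadString("http://example.dk/");
        }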

    Read the article

  • Scraping data from Flash (Games)

    - by awegawef
    I saw this video, and I am really curious how it was performed. Does anyone have any ideas? My intuition is that he scraped pixels from the screen (one per 'box'), and then fed that into some program to determine the next move. Is scraping pixel by pixel the way to do this, or is there a better way? I am looking to do something similar with either Java or Python. Thanks
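    A minimal sketch of the pixel-sampling idea in C# (the same approach exists in Java via java.awt.Robot and in Python via PIL's ImageGrab); the coordinates and box size are hypothetical.

        using System.Drawing;

        // Grab the region of the screen where the game is rendered...
        var area = new Rectangle(100, 100, 300, 300);
        using (var shot = new Bitmap(area.Width, area.Height))
        {
            using (Graphics g = Graphics.FromImage(shot))
                g.CopyFromScreen(area.Location, Point.Empty, area.Size);

            // ...then sample one pixel per 30px 'box' and feed the colors
            // to whatever logic decides the next move.
            Color box = shot.GetPixel(15, 15);
        }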

    Read the article

  • HTML scraping in PHP

    - by tsellon
    I've been doing some HTML scraping in PHP using regular expressions. This works, but the result is finicky and fragile. Has anyone used any packages that provide a more robust solution? A config-driven solution would be ideal, but I'm not picky.

    Read the article

  • Scraping &#151; character (long dash) error in Nokogiri

    - by DavidP6
    I am having trouble scraping a certain long dash that is encoded as &#151; on the Time magazine site. It looks like this: —. It works fine when the dash is encoded as &mdash;, but when the problem dash is scraped, it is returned as unknown characters. I am using Nokogiri and am wondering if I have to use some sort of special encoding? The page says it is encoded with UTF-8.

    Read the article

  • High latency issue for web service call from amazon aws ec2 to local server

    - by SibzTer
    We have a legacy web application running in our data center, on premises in Houston. We have developed a new .NET 4 based web application in order to provide new features to customers. The new web application is hosted in the Amazon AWS EC2 environment (N. Virginia region, us-east-1b zone). In order to integrate seamlessly with the legacy application, the new web application makes web service calls to retrieve data. We are seeing unusually high latency, on the order of 5+ seconds, for these web service calls. The exact same web service call returns in less than a second on our local PCs (which makes sense given physical proximity to the actual server). The weird part is that we have developers in California who also see millisecond response times. We are testing the web service response using third-party tools such as SoapUI, and Google Chrome extensions such as Advanced REST Client, Postman REST Client, etc. As if this weren't weird enough, we have noticed the same low latency from certain other EC2 instances while testing, which are in the same region and availability zone as well. If we experienced the high latency consistently from all the EC2 instances, I could understand, but there is something else going on. Comparing the various stats and results between the low-latency and high-latency EC2 servers does not show any significant differences: ping (constant 40ms), tracert, winmtr, etc. We have instances that are in the VPC as well, so I tried both the public and private IP addresses of the web service host server, and that didn't make a difference to the above results either. We need to resolve this latency issue, as it is causing the resulting web pages to load very slowly (almost 15+ seconds, which is simply unacceptable). The EC2 instances run Windows Server Datacenter 64-bit. Let me know if there is any other info I can provide to help diagnose this.

    Read the article

  • How does one qualify as a Web UI Developer?

    - by Duralumin
    I have about 20 years of experience with programming, most of it on the job, and right now I define myself as a web developer, because I think about half my expertise lies in the all-too-extended "web" field, both server side and client side, and because in recent years I have mostly been doing web development. I know my JavaScript, jQuery, jQuery UI, HTML4-5, CSS2-3 and some frameworks like Backbone.js and AngularJS. Since university I've always been interested in man-machine interaction, UI and UX. Recently, I saw the label "Web UI Developer" tossed around, and I thought that would be something I would like to qualify for; and I'd really like to qualify with confidence. I didn't find any certificate or similar, and I don't think there are any. Is the only way to qualify as a Web UI Developer having a job as one? What are the skills I need to have, and the resources I can use to acquire them?

    Read the article

  • Web publishing system with code highlighting

    - by Dragos Toader
    I'd like to publish some of the many programs I've written on the web. Is there a Linux web publishing application (CMS/blog/RoR app) with syntax highlighting that can display syntax for C++, Python, Bash scripts, SQL, VBA, awk, Erlang, Java, makefiles, Ruby, Pascal and other languages? The more syntax configuration files, the better. The extensions I have in TextPad (for which I have syntax highlighting .syn files) are .as, .asm, .asp, .awk, .bas, .bat, .c, .conf, .cpp, .cs, .ctl, .dfm, .dsc, .erl, .fnc, .h, .hpp, .inf, .ini, .jav, .java, .mak, .nsh, .nsi, .ora, .pas, .pkb, .pks, .pl, .prc, .py, .reg, .rsp, .sh, .sql, .syn, .tcl, .trg, .vw, .xml, .xsl, .xslfo

    Read the article

  • Page posting issue when working on screen scraping

    - by Muhammad Akhtar
    Hi, I am working on screen scraping and have done it successfully on 3 websites. I have an issue with the last website: when I hit it with my parameter, it shows the result on the next page; it simply posts to another page and shows the result fine there. Here is My Test. However, when I hit it from my application, since here I don't have an option to post, it only fetches the HTML of the requested page; that is obviously my above-mentioned HTML test link, which actually has the parameter in the URL to get the result. How can I handle this situation? Please give me a hint. Thanks. Here is my C# code; I am using HtmlAgilityPack:

        using HtmlAgilityPack;

        HtmlWeb hw = new HtmlWeb();          // issues a GET, not a POST
        string url = "http://mysampleURL";
        HtmlDocument doc = hw.Load(url);
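    Since HtmlWeb.Load only issues a GET, one option is a sketch like the following: post the form yourself and feed the response to HtmlAgilityPack. The form field name here is hypothetical and must match the form on the actual page.

        using System.IO;
        using System.Net;
        using System.Text;
        using HtmlAgilityPack;

        byte[] form = Encoding.UTF8.GetBytes("searchField=myParameter");

        var req = (HttpWebRequest)WebRequest.Create("http://mysampleURL");
        req.Method = "POST";
        req.ContentType = "application/x-www-form-urlencoded";
        req.ContentLength = form.Length;
        using (Stream s = req.GetRequestStream())
            s.Write(form, 0, form.Length);

        var doc = new HtmlDocument();
        using (WebResponse resp = req.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
            doc.LoadHtml(reader.ReadToEnd());   // the result page, as after posting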

    Read the article
