Search Results

Search found 42868 results on 1715 pages for 'web crawlers'.

Page 107/1715

  • C# Web GridView sorting

    - by tommypiaa
    Are there any examples of how I would enable sorting for the GridView? Here is my current code:

        private void loadlist()
        {
            cn.Open();
            cmd.CommandText = "select Breed, Name, Image from Animals";
            dr = cmd.ExecuteReader();
            cn.Close();
        }

        protected void TextBox1_TextChanged(object sender, EventArgs e)
        {
        }

        protected void Button1_Click(object sender, EventArgs e)
        {
            AddData();
            Addimage();
        }

        protected void AddData()
        {
            if (TextBox1.Text != "" & TextBox2.Text != "")
            {
                cn.Open();
                cmd.CommandText = "insert into Animals (Breed, Name) values ('" + TextBox1.Text + "', '" + TextBox2.Text + "')";
                cmd.ExecuteNonQuery();
                cmd.Clone();
                cn.Close();
                loadlist();
            }
        }

        protected void DisplayData()
        {
            cn.Open();
            cmd.CommandText = "select Breed, Name from Animals";
            dr = cmd.ExecuteReader();
            GridView1.DataSource = dr;
            GridView1.DataBind();
            cn.Close();
        }

    Read the article

  • How to store a 250 MB database in an Offline Web App

    - by Couto
    OK, maybe I'm not seeing the whole picture, but I need to brainstorm a bit. The purpose is to make a web app (HTML5, CSS, JavaScript) that has to search a 250 MB database without any internet connection, so yes, the database has to be on the client side. The hard part is that this app has to work on an iPod or iPhone without an internet connection (an initial connection to download the app is OK). localStorage has a 5 MB limit; CouchDB would be great since it exposes a web app easily accessed from JavaScript (privacy concerns don't matter at this point), but I'm pretty much out of ideas. Does anyone see an alternative, or a solution for this purpose?
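
    One option to investigate (a sketch only, not from the question): mobile Safari of that era supported Web SQL via openDatabase, which allows larger client-side stores than localStorage. The database name, schema, and the 50 MB quota request below are assumptions for illustration; the quota the device actually grants may still fall well short of 250 MB.

        // Sketch: open a client-side Web SQL database and run a simple search.
        // Assumes the target Safari version supports openDatabase.
        var db = openDatabase('offline_data', '1.0', 'Offline dataset', 50 * 1024 * 1024);

        db.transaction(function (tx) {
          // Hypothetical table; the real schema depends on the 250 MB dataset.
          tx.executeSql('CREATE TABLE IF NOT EXISTS records (id INTEGER PRIMARY KEY, body TEXT)');
        });

        function search(term, callback) {
          db.readTransaction(function (tx) {
            tx.executeSql(
              'SELECT id, body FROM records WHERE body LIKE ?',
              ['%' + term + '%'],
              function (tx, results) { callback(results.rows); }
            );
          });
        }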

    Read the article

  • Connect to web-service/API in MySQL?

    - by Jesse Figueroa
    I'm creating a SQL-based procedure which can (1) accept a table, (2) load the values one at a time, (3) send the variables to a remote API, (4) record the response of the API, and (5) write the response to a table for viewing later. I have successfully implemented 1, 2, and 5. I am hoping there may be some way of choosing an address to contact and having SQL listen for a response. Please let me know if you have any suggestions!

    Read the article

  • Hash passwords before transmitting? (web)

    - by wag2639
    I was reading this Ars article on password security and it mentioned there are sites that "hash the password before transmitting" it. Now, assuming this isn't using an SSL connection (HTTPS): (a) is this actually secure, and (b) if it is, how would you do this in a secure manner? Edit 1 (some thoughts based on the first few answers): (c) If you do hash the password before transmission, how do you use that if you only store a salted hash of the password in your user credentials database? (d) Just to check, if you are using an HTTPS-secured connection, is any of this necessary?

    Read the article

  • Loading jQuery multiple times in the same page

    - by Winaji
    I'm implementing a plug-in that's embeddable in different sites (a la Meebo, Wibiya), and I want to use jQuery. The problem is, the site I'm embedding into may already have a jQuery script loaded. So what's the best approach to this kind of problem? Should I just check whether jQuery is already loaded and, if so, use the host site's jQuery, otherwise load it myself? If I use this approach, am I not risking compatibility problems (i.e. the site uses an old version of jQuery)? Should I load jQuery myself (whether it's already loaded or not) and call "jQuery.noConflict(true)" when it's finished loading? If so, how can I make sure that my jQuery has finished loading (hooking into the onload event doesn't seem to work all the time, and polling for "jQuery" won't work for obvious reasons)? Should I do something else? Thanks.
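
    For the second option, one pattern worth sketching (an illustration, not part of the question): inject your own script tag, wait for its load/readystatechange event, then call jQuery.noConflict(true) to keep a private reference and hand the page's original globals back. The CDN URL and the initPlugin entry point below are placeholders.

        // Sketch: load a private copy of jQuery alongside whatever the host page uses.
        (function () {
          var script = document.createElement('script');
          script.src = 'https://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js';
          script.onload = script.onreadystatechange = function () {
            // onreadystatechange covers older IE, where onload is unreliable.
            if (!script.readyState || /loaded|complete/.test(script.readyState)) {
              script.onload = script.onreadystatechange = null;
              var pluginJQuery = jQuery.noConflict(true); // restore the host page's $ and jQuery
              initPlugin(pluginJQuery);                   // hypothetical plug-in entry point
            }
          };
          document.getElementsByTagName('head')[0].appendChild(script);
        })();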

    Read the article

  • Invoke web page from Linux C

    - by umetzu
    Hi, I need to get all the HTML text from a URL ("http://localhost/index.html") into a string variable in C. I know that if I use telnet - telnet www.google.com 80, then GET - it returns all the HTML. How can I do it? I'm in a Linux environment, with C (NOT C++). BTW, I'm a .NET programmer :/

    Read the article

  • Web scraping from a Google Chrome extension

    - by limoragni
    I've started to develop a Chrome extension to navigate and perform actions on a website. So far the extension is able to receive a couple of parameters, check a set of radio buttons, fill in a few inputs of a form, and then submit it. What I want to do now is to repeat the process, but I'm stuck when the page is reloaded: I don't know how to make the script react when the request finishes. The workflow I want to achieve is the following (it is for automatically copying a certain object). On the popup side: enter the number of the master object to copy; enter the base name of the copies (for example Mod, so I can iterate and add mod1, mod2, modn); enter the number of copies. On the background side: select the master; select the standard options; fill in the inputs; submit the form; wait for the page to complete the request and continue to the next copy (here is where I need help). The problem is the repetition; the rest is taken care of. I assume there must be a way of dealing with requests. Any ideas? By the way, I'm doing it all with the extension and tabs methods of Google Chrome plus JavaScript and jQuery.
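
    Since the question already mentions the tabs API, here is a minimal sketch (an assumption about the setup, not code from the question) of reacting to the reload from the background page: chrome.tabs.onUpdated fires with status 'complete' once the navigation triggered by the form submission finishes. workingTabId and copyNext are hypothetical names.

        // Sketch: continue the copy loop once the tab finishes reloading.
        // Assumes the "tabs" permission is declared in the extension manifest.
        var workingTabId = null; // set when the copy loop starts on a tab

        chrome.tabs.onUpdated.addListener(function (tabId, changeInfo, tab) {
          if (tabId === workingTabId && changeInfo.status === 'complete') {
            copyNext(tab); // hypothetical: fill and submit the form for the next copy
          }
        });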

    Read the article

  • Web hosting for a setup where both Windows and Linux are required

    - by lk
    I have a system where a website needs to be hosted on a Linux machine while a backend application that the site talks to needs to reside on Windows. Is there any "common practice" for such hosting? Note: both of the systems are mine, so there is the dilemma of whether to have the machines physically located together to avoid latency on calls over the network.

    Read the article

  • Accessing elements of this XML

    - by csU
    I have this WSDL:

        <wsdl:definitions targetNamespace="http://www.webserviceX.NET/">
          <wsdl:types>
            <s:schema elementFormDefault="qualified" targetNamespace="http://www.webserviceX.NET/">
              <s:element name="ConversionRate">
                <s:complexType>
                  <s:sequence>
                    <s:element minOccurs="1" maxOccurs="1" name="FromCurrency" type="tns:Currency"/>
                    <s:element minOccurs="1" maxOccurs="1" name="ToCurrency" type="tns:Currency"/>
                  </s:sequence>
                </s:complexType>
              </s:element>
              <s:simpleType name="Currency">
                <s:restriction base="s:string">
                  <s:enumeration value="AFA"/>
                  <s:enumeration value="ALL"/>
                  <s:enumeration value="DZD"/>
                  <s:enumeration value="ARS"/>

    I am trying to get at all of the elements in the enumeration but can't seem to get it right. This is homework, so please no full solutions, just guidance if possible.

        $feed = simplexml_load_file('http://www.webservicex.net/CurrencyConvertor.asmx?WSDL');
        foreach ($feed->simpleType as $val) {
            $ns_s = $val->children('http://www.webserviceX.NET/');
            echo $ns_s->enumeration;
        }

    What am I doing wrong? Thanks.

    Read the article

  • Drupal and Back-End Complexity

    - by Brian
    Currently I am working on a school website, and we are still in the decision-making process of choosing a framework (we know that we're not using Joomla! or hand-coding). Drupal came up as a viable choice, and currently it is my best bet for the site. However, I have an issue with CMSs in general. I would like to develop a fairly complicated, custom-suited back-end application for teachers to interact with individual students, including the design of shared/custom calendars, announcement privileges, etc. I currently have some expertise with HTML, CSS, PHP, and MySQL, and I could wrap my head around some JavaScript and AJAX if need be. However, would such a complicated application work with Drupal, in that I could build it to specifically suit my purposes?

    Read the article

  • Understanding configuration for parallel calls in a web app (IIS + MS SQL)

    - by mmcteam.com.ua
    We have an ASP.NET MVC application + IIS 7.5 + SQL Server 2008 R2. We have to load a lot of aggregate counters on each page. We decided to use AJAX, calling from JavaScript for each counter or group of counters and returning them as a JSON result. This solves the problem of the user waiting for page loading: the page loads fast, and the user waits for the counters while seeing the rest of the page content. We thought that if we made the calls from JavaScript our queries would run asynchronously, but we noticed that they do not: all our JavaScript calls fire immediately, but the actions they invoke are queued. If we use the AsyncController ability, all counters are calculated simultaneously, but the user has to wait for the longest counter calculation before the page loads. The question: we want to understand what happens if we use AJAX and call two or more actions simultaneously, and how we can configure this. (Also, in each action we make some queries to SQL Server.)
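
    For reference, a client-side sketch of firing two counter requests at once (the endpoint names and element IDs are hypothetical). Both requests go out immediately; whether the server actually executes the two actions concurrently depends on server-side configuration, which is the crux of the question.

        // Sketch: two AJAX calls issued in parallel from the page.
        $.getJSON('/Counters/Orders', function (data) {
          $('#orders-counter').text(data.value);
        });
        $.getJSON('/Counters/Visitors', function (data) {
          $('#visitors-counter').text(data.value);
        });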

    Read the article

  • A simple, clean web layout

    - by Shaun_web
    OK, so I hate CSS/HTML graphic design... What do you recommend as a sample template or website that you think has a great CSS and HTML layout? From my past experience it's best to start from a page that has a white background and little dependency on graphics -- that's clean and easy to modify ;-).

    Read the article

  • image doesn't always render on web page

    - by zsharp
    One of my PNG files does not always get displayed in the browser (in both Firefox and IE). In Firebug the image is visible. Sometimes I'll even see the image start loading and halfway through it fizzles. What could this be? The image is approximately 10 KB.

    Read the article

  • IE8 disappearing image bug

    - by Joe Fletcher
    In IE8 (and maybe others), when I leave my page to go to another tab in IE and then come back to my page's tab, each time the cursor runs over an image it disappears until I refresh the page. I've heard of disappearing-image bugs, but I couldn't find anything on this particular case, especially given this isn't a weird pre-IE8 bug. I am using a lightbox, so possibly it's something to do with JavaScript? http://dev.bwmsnow.co.nz/snowboarding

    Read the article

  • best way to make this app/website combo?

    - by mharris7190
    I have an idea for an app that lets different bars upload drink specials via a webpage to that app. They would log in to the website, and the website would prompt them with a box to input the drink specials. When they are done, the app pulls the information and creates a card for that bar. The app would launch into a week view where you select the day you want to see specials for. After the day is selected, it brings you to a scroll view with different bars' cards laid out vertically, each taking up the width of the screen, allowing the user to scroll through the deals for that day of the week. How should I go about doing this if I have very little programming experience? Is there any convention for doing an app like this? Can anyone suggest any reading material that would help? Thank you!

    Read the article

  • What's the best way to make a mobile friendly site?

    - by Frew
    Speaking entirely in technology-free terms, what is the best way to make a mobile friendly site? That is, I want to make a site that will work on a regular computer but also have mobile versions of the pages. Should I rewrite each page? The pages will probably have different functionality, so should I rewrite the backend code? Should it be an effectively different site with the same database?

    Read the article

  • Does .live() binding work for jQuery in IE7?

    - by Steve
    Hi everyone, I have a piece of JavaScript which is supposed to latch onto a form which gets introduced via XHR. It looks something like:

        $(document).ready(function() {
            $('#myform').live('submit', function() {
                $(foo).appendTo('#myform');
                $(this).ajaxSubmit(function() {
                    alert("HelloWorld");
                });
                return false;
            });
        });

    This happens to work in FF3, but not in IE7. Any idea what the problem is?

    Read the article

  • Managing JS and CSS for a static HTML web application

    - by Josh Kelley
    I'm working on a smallish web application that uses a little bit of static HTML and relies on JavaScript to load the application data as JSON and dynamically create the web page elements from that. First question: Is this a fundamentally bad idea? I'm unclear on how many web sites and web applications completely dispense with server-side generation of HTML. (There are obvious disadvantages of JS-only web apps in the areas of graceful degradation / progressive enhancement and being search engine friendly, but I don't believe that these are an issue for this particular app.) Second question: What's the best way to manage the static HTML, JS, and CSS? For my "development build," I'd like non-minified third-party code, multiple JS and CSS files for easier organization, etc. For the "release build," everything should be minified, concatenated together, etc. If I were doing server-side generation of HTML, it'd be easy to have my web framework generate different development versus release HTML that includes either multiple verbose files or the concatenated minified code. But given that I'm only using static HTML, what's the best way to manage this? (I realize I could hack something together with ERB or Perl, but I'm wondering if there are any standard solutions.) In particular, since I'm not doing any server-side HTML generation, is there an easy, semi-standard way of setting up my static HTML so that it contains code like

        <script src="js/vendors/jquery.js"></script>
        <script src="js/class_a.js"></script>
        <script src="js/class_b.js"></script>
        <script src="js/main.js"></script>

    at development time and

        <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
        <script src="js/entire_app.min.js"></script>

    for release?
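
    In the absence of a server-side framework, one low-tech possibility (a sketch under assumed file names, not a standard tool) is a small Node.js build step that concatenates the development scripts into the single release file; the development HTML and the release HTML then simply reference different script tags. Minification would be a separate step with whatever minifier you prefer, and jquery.js is left out of the bundle because the release page loads jQuery from the CDN.

        // build.js - sketch: concatenate the app's development scripts for release.
        var fs = require('fs');

        var sources = [
          'js/class_a.js',
          'js/class_b.js',
          'js/main.js'
        ];

        var bundle = sources
          .map(function (path) { return fs.readFileSync(path, 'utf8'); })
          .join(';\n');

        fs.writeFileSync('js/entire_app.min.js', bundle);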

    Read the article

  • MS Bing web crawler out of control causing our site to go down

    - by akaDanPaul
    Here is a weird one that I am not sure what to do about. Today our company's e-commerce site went down. I tailed the production log and saw that we were receiving a ton of requests from the IP range 157.55.98.0/157.55.100.0. I googled around and found out that it is an MSN web crawler. So essentially the MS web crawler overloaded our site, causing it not to respond, even though our robots.txt file contains the following: Crawl-delay: 10. What I did was just ban the IP range in iptables. But what I am not sure about is how to follow up from here. I can't find anywhere to contact Bing about this issue, and I don't want to keep those IPs blocked because I am sure we will eventually get de-indexed from Bing. It doesn't really seem like this has happened to anyone else before. Any suggestions? Update - my server / web stats: Our web server is running Nginx, Rails 3, and 5 Unicorn workers. We have 4 GB of memory and 2 virtual cores. We have been running this setup for over 9 months now and never had an issue; 95% of the time our system is under very little load. On average we receive 800,000 page views a month, and this never comes close to bringing down or slowing down our web server. Looking at the logs, we were receiving anywhere from 5 up to 40 requests/second from this IP range. In all my years of web development I have never seen a crawler hit a website so many times. Is this new with Bing?

    Read the article

  • Sharing authentication methods across API and web app

    - by Snixtor
    I'm wanting to share an authentication implementation across a web application, and web API. The web application will be ASP.NET (mostly MVC 4), the API will be mostly ASP.NET WEB API, though I anticipate it will also have a few custom modules or handlers. I want to: Share as much authentication implementation between the app and API as possible. Have the web application behave like forms authentication (attractive log-in page, logout option, redirect to / from login page when a request requires authentication / authorisation). Have API callers use something closer to standard HTTP (401 - Unauthorized, not 302 - Redirect). Provide client and server side logout mechanisms that don't require a change of password (so HTTP basic is out, since clients typically cache their credentials). The way I'm thinking of implementing this is using plain old ASP.NET forms authentication for the web application, and pushing another module into the stack (much like MADAM - Mixed Authentication Disposition ASP.NET Module). This module will look for some HTTP header (implementation specific) which indicates "caller is API". If the header "caller is API" is set, then the service will respond differently than standard ASP.NET forms authentication, it will: 401 instead of 302 on a request lacking authentication. Look for username + pass in a custom "Login" HTTP header, and return a FormsAuthentication ticket in a custom "FormsAuth" header. Look for FormsAuthentication ticket in a custom "FormsAuth" header. My question(s) are: Is there a framework for ASP.NET that already covers this scenario? Are there any glaring holes in this proposed implementation? My primary fear is a security risk that I can't see, but I'm similarly concerned that there may be something about such an implementation that will make it overly restrictive or clumsy to work with.

    Read the article

  • Identity Propagation for Web Service - 11g

    - by Prakash Yamuna
    I came across this post from Beimond on how to do identity propagation using OWSM. As I have mentioned in the past here, here and here, Beimond has a number of excellent posts on OWSM. However, I found one part of his comment puzzling. I quote: "OWSM allows you to pass on the identity of the authenticated user to your OWSM protected web service ( thanks to OPSS ), this username can then be used by your service. This will work on one or between different WebLogic domains. Off course when you don't want to use OWSM you can always use Oracle Access Manager OAM which can do the same." The sentence in red highlights the issue I find puzzling. In fact, I just discussed this particular topic recently here. So let me try to clarify a few points: a) OAM is used for Web SSO. b) OWSM is used for securing Web Services. You cannot do identity propagation for Web Services using OAM. c) You use SAML to do identity propagation across Web Services. OAM also supports SAML - but that is the browser profile of SAML, relevant in the context of Web SSO, and it is not related to the SAML Token Profile defined as part of the WS-Security spec.

    Read the article

  • Just Another Web Service (JAWS) vs SOA

    Over the last few years SOA has been a hot topic, leaving it open to abuse by many who have no understanding of the concept. In my opinion, one of the largest issues facing SOA is the lack of understanding and experience implementing SOA by business and IT alike. I just recently deployed a new web service that is called by multiple service clients. Would you call this SOA because it is a web service that can be called by any requesting client? In my opinion, this is not SOA; instead it is Just Another Web Service (JAWS). Just because a company creates a web service does not mean that it is using SOA; in fact, it only means that it is using a web service. SOA is an architectural style that focuses on the design of systems based on consumers and providers through the use of contracts. With this approach, SOA needs to be applied from the top down in order for it to reach its full potential. In the case of the web service, the service is just a small part of the entire system that is reusable and has the flexibility to change. For a company in this case to move towards SOA, it needs to define business processes that can be shared through the use of reusable software and loose coupling. Once the company's thinking and development process change to address change in this manner, it can start to move toward SOA.

    Read the article
