Search Results

Search found 37094 results on 1484 pages for 'mathieu page'.

  • page posting issue when working in Screen Scraping

    - by Muhammad Akhtar
    Hi, I am working on screen scraping and have done it successfully on three websites, but I have an issue with the last one. Here is my url: when I hit it with my parameter, the site simply posts to another page and shows the result fine on that next page. Here is My Test. However, when I hit it from my application I don't have an option to post, so it only fetches the HTML of the requested page, which is the HTML test link mentioned above that actually has the parameter in the URL to get the result. How can I handle this situation? Please give me a hint. Thanks. Here is my C# code; I am using HtmlAgilityPack:

        String url;
        HtmlWeb hw = new HtmlWeb();
        HtmlDocument doc;
        url = "http://mysampleURL";
        doc = hw.Load(url);
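
    One way around this, sketched below under the assumption that the target page accepts a plain form POST: send the POST yourself (here with WebClient.UploadValues) and feed the returned HTML into HtmlAgilityPack with LoadHtml instead of HtmlWeb.Load. The URL and field name are placeholders; the real ones have to be read from the form on the target page.

        using System.Collections.Specialized;
        using System.Net;
        using System.Text;
        using HtmlAgilityPack;

        string postUrl = "http://mysampleURL/results";     // assumption: the form's action URL
        var form = new NameValueCollection();
        form["searchTerm"] = "my parameter";               // assumption: field name taken from the target form

        byte[] response;
        using (var client = new WebClient())
        {
            // UploadValues sends the data as an HTTP POST and returns the response body.
            response = client.UploadValues(postUrl, "POST", form);
        }

        var doc = new HtmlDocument();
        doc.LoadHtml(Encoding.UTF8.GetString(response));   // parse the result page produced by the POST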

  • Javascript - find swfobject on included page and call javascript function

    - by Rob
    I'm using the following script on my website to play an mp3 in flash. To instantiate the flash object I use the swfobject framework in a javascript function. When the function is called, the player is created and added to the page. The rest of the website is in php, and the page calling this script is included with the php include function. All the other scripts used are in the php 'master' page.

        var playerMp3 = new SWFObject("scripts/player.swf","myplayer1","0","0","0");
        playerMp3.addVariable("file","track.mp3");
        playerMp3.addVariable("icons","false");
        playerMp3.write("player1");

        var player1 = document.getElementById("myplayer1");
        var status1 = $("#status1");

        $("#play1").click(function(){
            player1.sendEvent("play","true");
            $("#status1").fadeIn(400);
            player4.sendEvent("stop","false");
            $("#status4").fadeOut(400);
            player3.sendEvent("stop","false");
            $("#status3").fadeOut(400);
            player2.sendEvent("stop","false");
            $("#status2").fadeOut(400);
        });
        $("#stop1").click(function(){
            player1.sendEvent("stop","false");
            $("#status1").fadeOut(400);
        });
        $(".closeOver").click(function(){
            player1.sendEvent("stop","false");
            $("#status1").fadeOut(400);
        });
        $(".accordionButton2").click(function(){
            player1.sendEvent("stop","false");
            $("#status1").fadeOut(400);
        });
        $(".accordionButton3").click(function(){
            player1.sendEvent("stop","false");
            $("#status1").fadeOut(400);
        });
        $(".turnOffMusic").click(function(){
            player1.sendEvent("stop","false");
            $("#status1").fadeOut(400);
        });
        });

    I have a play button with the id '#play1' and a stop button with the id '#stop1' on my page. A div on the same page has the id '#status1', and a little image of a speaker is in that div. When you push the play button the div with the speaker fades in, and when you push the stop button the div with the speaker fades out; very simple, and it works as I want it to. The problem is that when a song is finished, the speaker doesn't fade out. Is there a simple solution for this? I already tried using the swfobject framework to get the flash player from the page and call 'IsPlaying' on it, but I'm getting an error that 'swfobject' can't be found. All I need is a little push in the right direction or an example showing me how I can correctly get the currently playing audio player (in flash), check whether it's playing and, if it has finished, call a javascript function to let the speaker image fade out again. Hope someone here can help me.
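
    Not a definitive answer, but if player.swf is the JW Player 4.x that this sendEvent-style API suggests, the usual approach is to register a state listener on the player and fade the speaker out when playback reaches the COMPLETED state. The listener method and state names below come from the JW Player 4 JavaScript API and are an assumption here, so verify them against the player version you actually ship.

        // Run this once the SWF has finished loading (e.g. shortly after playerMp3.write()).
        var player1 = document.getElementById("myplayer1");

        // Register a named global callback for state changes on this player instance.
        player1.addModelListener("STATE", "onPlayer1State");

        function onPlayer1State(obj) {
            // obj.newstate is one of IDLE, BUFFERING, PAUSED, PLAYING, COMPLETED
            if (obj.newstate === "COMPLETED") {
                $("#status1").fadeOut(400);   // track finished: hide the speaker icon
            }
        }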

  • jQuery partial page refresh after form submit

    - by heeboir
    When using jQuery to submit a form, is it possible to place the resulting page (after the submit) inside another HTML element? I'll try to make this clearer. Up to now I've been using callback methods that, among other things, do a document.forms['form'].submit(); whenever a form has been updated with new information and needs to be refreshed. However, this results in a full page refresh. I'm implementing partial page refresh using jQuery and thought of using something like

        var newContent = jQuery('#form').submit();
        jQuery('#div').load(newContent);

    However, that does not seem to work, as the page is still fully refreshed. The content is correct, but the behaviour seems to be exactly the same as before, so I'm not really sure whether what I want is actually possible with jQuery. Any hints and pointers would be helpful. Thanks.
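
    It is possible; the trick is to cancel the native submit and send the form data yourself with Ajax, then drop the returned HTML into the target element. A minimal sketch, reusing the #form and #div ids from the question (the action URL comes from the form itself):

        jQuery('#form').submit(function (e) {
            e.preventDefault();                        // stop the normal full-page submit
            jQuery.post(this.action, jQuery(this).serialize(), function (html) {
                jQuery('#div').html(html);             // place the server's response inside the div
            });
        });

    If the server returns a complete HTML document rather than a fragment, it is worth adding an endpoint or parameter that returns only the piece that belongs inside #div.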

  • How to disable the maximize button in a WPF application page, not in a window

    - by Lukman
    I want to disable the maximize button for a WPF application Page, not for a WPF application Window. When I searched Google, the available methods were for disabling the maximize button on a WPF Window, with code snippets such as:

        Window.WindowStyle = WindowStyle.None;

    But this does not work in a WPF Page; it shows errors. In a WPF Page there is a WindowStyleProperty instead of WindowStyle, and I get these errors:

        Error 1: A static readonly field cannot be assigned to (except in a static constructor or a variable initializer)
        Error 2: Cannot implicitly convert type 'System.Windows.WindowStyle' to 'System.Windows.DependencyProperty'

    So how can I disable the maximize button in a WPF Page, and how should I implement it? Please suggest a sample code snippet. Thanks in advance.
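
    A Page has no maximize button of its own; the button belongs to the Window that hosts the Page, so the usual approach is to reach that Window from the Page and change it there. A sketch, assuming the Page is hosted in a regular Window (a NavigationWindow or browser host behaves differently); ResizeMode.CanMinimize keeps the title bar but disables the maximize button:

        public partial class MyPage : System.Windows.Controls.Page   // assumption: your Page class
        {
            public MyPage()
            {
                InitializeComponent();
                Loaded += (s, e) =>
                {
                    // The hosting Window only exists once the Page has been loaded.
                    var host = System.Windows.Window.GetWindow(this);
                    if (host != null)
                    {
                        host.ResizeMode = System.Windows.ResizeMode.CanMinimize;
                    }
                };
            }
        }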

  • No page number for divider pages in LaTeX

    - by joec
    I have a report in LaTeX, and I have used the following commands to create my appendix; however, my lecturer states that any divider pages should be unnumbered.

        \documentclass{report}
        \usepackage{appendix}
        \begin{document}
        \include{chap1}
        \include{appendix}
        \end{document}

    Then in appendix.tex:

        \appendix
        \pagestyle{empty}
        \appendixpage
        \noappendicestocpagenum
        \addappheadtotoc

    This creates the Appendices divider page, but still puts a page number on it in the footer. There is no page number in the TOC, as expected. How can I remove it from the footer? Thanks
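
    One workaround, sketched under the assumption that patching \appendixpage itself is more trouble than it is worth: skip \appendixpage and typeset the divider page by hand, where \thispagestyle{empty} is guaranteed to apply to exactly that page.

        \appendix
        \noappendicestocpagenum
        \cleardoublepage
        \thispagestyle{empty}                        % no header or footer on the divider page
        \addcontentsline{toc}{chapter}{Appendices}   % keep the TOC entry \addappheadtotoc would add
        \vspace*{\fill}
        \begin{center}\Huge\bfseries Appendices\end{center}
        \vspace*{\fill}
        \clearpage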

  • Apache mod-pagespeed installation affects mod-spdy?

    - by tim peterson
    Recently my site (an https connection, running on an Amazon EC2 Ubuntu Apache 2.2) has an issue where I need to load a page several times (3-4) before it will load normally. It will then keep loading normally as long as I keep loading pages regularly (every couple of seconds), but it stalls again if I don't load any pages for a few minutes. It has nothing to do with my application, because I don't have this problem with the exact same codebase on the Apache installation on my laptop. The only things I have changed, to my knowledge, are that I recently installed mod_spdy and then, a few weeks later, mod_pagespeed (https://developers.google.com/speed/pagespeed/mod). However, I have since turned mod_pagespeed off by setting its pagespeed.conf to mod_pagespeed off. Unfortunately, that didn't solve the problem. Every one of the last 10 lines of my error.log looks like the lines below:

        # tail -f /var/log/apache2/error.log
        ...
        [32728:32729:ERROR:mod_spdy.cc(162)] request->chunked == 1 in request GET / HTTP/1.1
        [Sat Jun 02 04:50:08 2012] [warn] [client 50.136.93.153] [stream 5]
        [32728:32729:WARNING:http_to_spdy_filter.cc(113)] HttpToSpdyFilter is not the last filter in the chain: chunk

    Any thoughts? Thank you, tim
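
    Not a diagnosis, but the log lines point at mod_spdy rather than mod_pagespeed, so one way to isolate the culprit on Ubuntu is to disable the modules one at a time with a2dismod (which removes the symlinks under mods-enabled) and see when the stall disappears. A sketch, assuming the modules were registered under their usual names:

        # disable mod_spdy only, leaving mod_pagespeed as it is
        sudo a2dismod spdy
        sudo service apache2 restart
        # retest the site; re-enable with "sudo a2enmod spdy" and repeat the test for pagespeed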

  • mod_rewrite > /user/tag1/tag2/tag..?page=1

    - by user293479
    I am trying to use mod_rewrite to pretty up a URL. I want the URL to look like this:

        http://example.com/bart/school?page=2

    and the rewritten URL to be:

        http://localhost:8080/app?user=bart&tag1=school&page=2

    If possible, I would also like to be able to have more than one tag per user, so that:

        http://example.com/bart/school/lisa?page=2

    would become:

        http://localhost:8080/app?user=bart&tag1=school&tag2=lisa&page=2

    As far as I can tell this is possible using mod_rewrite, but I can't seem to figure it out. Any help would be really appreciated!
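
    A sketch of rules that match this layout, assuming they sit in example.com's vhost or .htaccess and that mod_proxy is enabled so the [P] flag can hand the request to the app on port 8080; [QSA] carries the original ?page=2 over onto the rewritten query string:

        RewriteEngine On
        # /user/tag1/tag2  ->  app?user=...&tag1=...&tag2=...
        RewriteRule ^/?([^/]+)/([^/]+)/([^/]+)/?$ http://localhost:8080/app?user=$1&tag1=$2&tag2=$3 [P,QSA,L]
        # /user/tag1       ->  app?user=...&tag1=...
        RewriteRule ^/?([^/]+)/([^/]+)/?$ http://localhost:8080/app?user=$1&tag1=$2 [P,QSA,L]

    More tags can be handled the same way with one extra rule per depth, or by collapsing the tail into a single parameter and splitting it inside the app.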

  • Report group headings not repeating on every page.

    - by ProfK
    I have an RDLC report with three tables and their associated datasets. In my second table, I cannot get the two 'header' rows to repeat on each printed page. When viewed interactively, each table is on its own page and this isn't a problem. When I switch to print layout, my second table now spans two pages, and the second page gets no header rows. Am I missing a setting or something? ADDED: I do have the 'Repeat header columns on each page' option checked.
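
    If the designer checkbox does not seem to take effect, it can be worth opening the .rdlc XML and checking the flag directly; in the 2005/2008-era RDL schema the setting that repeats a table's (or a table group's) heading rows on every page is RepeatOnNewPage on the corresponding Header element. Roughly, with the table name as a placeholder:

        <Table Name="table2">
          <Header>
            <RepeatOnNewPage>true</RepeatOnNewPage>
            <TableRows> ... </TableRows>
          </Header>
          <TableGroups>
            <TableGroup>
              <Header>
                <RepeatOnNewPage>true</RepeatOnNewPage>
                <TableRows> ... </TableRows>
              </Header>
            </TableGroup>
          </TableGroups>
        </Table>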

  • Infinite loop using Spring Security - Login page is protected even though it should allow anonymous

    - by Tai Squared
    I have a Spring application (Spring version 2.5.6.SEC01, Spring Security version 2.0.5) with the following setup.

    web.xml:

        <welcome-file-list>
            <welcome-file>index.jsp</welcome-file>
        </welcome-file-list>

    The index.jsp page is in the WebContent directory and simply contains a redirect:

        <c:redirect url="/login.htm"/>

    In appname-servlet.xml there is a view resolver pointing to the JSP pages in WEB-INF/jsp:

        <bean id="viewResolver" class="org.springframework.web.servlet.view.InternalResourceViewResolver">
            <property name="viewClass" value="org.springframework.web.servlet.view.JstlView" />
            <property name="prefix" value="/WEB-INF/jsp/" />
            <property name="suffix" value=".jsp" />
        </bean>

    In the security-config.xml file, I have the following configuration:

        <http>
            <!-- Restrict URLs based on role -->
            <intercept-url pattern="/WEB-INF/jsp/login.jsp*" access="ROLE_ANONYMOUS" />
            <intercept-url pattern="/WEB-INF/jsp/header.jsp*" access="ROLE_ANONYMOUS" />
            <intercept-url pattern="/WEB-INF/jsp/footer.jsp*" access="ROLE_ANONYMOUS" />
            <intercept-url pattern="/login*" access="ROLE_ANONYMOUS" />
            <intercept-url pattern="/index.jsp" access="ROLE_ANONYMOUS" />
            <intercept-url pattern="/logoutSuccess*" access="ROLE_ANONYMOUS" />
            <intercept-url pattern="/css/**" filters="none" />
            <intercept-url pattern="/images/**" filters="none" />
            <intercept-url pattern="/**" access="ROLE_ANONYMOUS" />
            <form-login login-page="/login.jsp"/>
        </http>

        <authentication-provider>
            <jdbc-user-service data-source-ref="dataSource" />
        </authentication-provider>

    However, I can't even navigate to the login page, and I get the following error in the log:

        WARNING: The login page is being protected by the filter chain, but you don't appear to have anonymous authentication enabled. This is almost certainly an error.

    I've tried changing ROLE_ANONYMOUS to IS_AUTHENTICATED_ANONYMOUSLY, changing the login-page to index.jsp and login.htm, and adding different intercept-url values, but I can't make the login page accessible while security still applies to the other pages. What do I have to change to avoid this loop?
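
    A sketch of the usual fix for that warning in the Spring Security 2.0 namespace: either take the login URL out of the filter chain entirely with filters="none", or make sure anonymous authentication is actually enabled (the <anonymous/> element) so that ROLE_ANONYMOUS exists when the login page is checked. The patterns below are illustrative and would need adjusting to your URLs; note also that the /WEB-INF/jsp/... patterns never match, because the filter chain sees request URLs such as /login.htm, not the forwarded JSP paths.

        <http>
            <!-- let the login page and static resources bypass the security filters -->
            <intercept-url pattern="/login*" filters="none" />
            <intercept-url pattern="/css/**" filters="none" />
            <intercept-url pattern="/images/**" filters="none" />
            <!-- everything else requires a logged-in user -->
            <intercept-url pattern="/**" access="IS_AUTHENTICATED_FULLY" />
            <form-login login-page="/login.jsp" />
            <anonymous />   <!-- enables the anonymous authentication filter -->
        </http>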

  • Redirect http to https for a specific page in a site in Apache

    - by Avinash
    Hi, I want one of my site's pages to use only HTTPS. I have manually made the links point to https, but I want that if a user manually types that page's URL with http, it is redirected to the https page. So if a user types

        http://mydomain.com/application.php

    it should be redirected to

        https://mydomain.com/application.php

    Thanks, Avinash
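
    A sketch for the HTTP (port 80) vhost or .htaccess, assuming mod_rewrite is enabled; it only touches /application.php and leaves every other URL on plain http:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^/?application\.php$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    Using R=302 instead of R=301 during testing avoids browsers caching the redirect permanently.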

  • Add a wall post to a page or application wall as page or application with facebook graph API

    - by blauesocke
    Hi, I want to create a new wall post on an application page or a "normal" page with the Facebook Graph API. Is there a way to "post as page"? With the old REST API it worked like this:

        $facebook->api_client->stream_publish($message, NULL, $links, $targetPageId, $asPageId);

    So, if I passed equal IDs for $targetPageId and $asPageId, I was able to post a "real" wall post that was not attributed to my own Facebook account. Thanks!
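
    With the Graph API the equivalent is to post to /PAGE_ID/feed using the page's own access token instead of the user token; the page tokens are returned by /me/accounts for a user who administers the page and has granted the manage_pages permission. A sketch with a Graph-capable PHP SDK (variable names are placeholders):

        // Each entry returned for the pages this user administers carries a page access token.
        $accounts = $facebook->api('/me/accounts');

        $pageToken = null;
        foreach ($accounts['data'] as $account) {
            if ($account['id'] == $targetPageId) {
                $pageToken = $account['access_token'];   // token that lets us act as the page
            }
        }

        // Publishing with the page token makes the post appear as the page itself.
        $facebook->api('/' . $targetPageId . '/feed', 'POST', array(
            'message'      => $message,
            'access_token' => $pageToken,
        ));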

  • To Ajax or Not to Ajax a listing page

    - by kaivalya
    Here I am talking about product listing pages where there are multiple filters (product types, categories, price range, etc.) that filter the list of products appearing on the page. I have built such pages both with and without Ajax in the past. What I like about using Ajax on such a page is that, when filters are selected, I only update the section that contains the product list. There is no need to refresh the whole page, which could end up re-loading the images in the top bar, banners, etc. and slow things down for the user. The Ajax way, in my opinion, is more compact and responsive in terms of user experience. The downside of the Ajax route for me is that, since the filter states are not maintained in the URL, I end up maintaining them on the server. This becomes complicated if I want to handle multi-window scenarios, and it is also costly to keep such state in server memory for each session. Not using Ajax and simply keeping all filter values in the URL and refreshing the page is quite simple, but the luxury of refreshing only the pane that really needs refreshing is lost. Lately I am seeing a lot of large-scale e-commerce sites using the non-Ajax approach on their listing pages, and this makes me question one more time whether it might be more efficient to build the non-Ajax listing page for long-term ease of maintenance and sacrifice a little bit of user experience. I am about to start implementing a new listing page for a product, I have the flexibility to go either way, and I would appreciate your input.
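
    One middle ground worth noting for the state problem: keep the filter state in the URL even when the list is fetched over Ajax, so the server stays stateless and multiple windows, bookmarks and the back button keep working. A sketch using the HTML5 history API; the endpoint and element ids are placeholders, and older browsers simply fall back to a full page load:

        function applyFilters(filters) {
            var query = jQuery.param(filters);               // e.g. "category=shoes&maxPrice=50"
            if (window.history && history.pushState) {
                history.pushState(filters, '', '?' + query); // the URL now carries the filter state
                jQuery('#product-list').load('/products/list?' + query);  // refresh only the list pane
            } else {
                window.location.search = '?' + query;        // no pushState: normal full refresh
            }
        }

        // Restore the matching list when the user navigates back or forward.
        window.onpopstate = function () {
            jQuery('#product-list').load('/products/list' + window.location.search);
        };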

  • "Authorize" attribute and 403 error page

    - by zerkms
    The [Authorize] attribute is a nice and handy MS invention, and I hope it can solve the issue I have now. To be more specific: when the current client isn't authenticated, [Authorize] redirects from the secured action to the logon page and, after logon succeeds, brings the user back; this is good. But when the current client is already authenticated, yet not authorized to run a specific action, all I need is to display my general 403 page. Is this possible without moving the authorization logic into the controller's body? UPD: the behaviour I need should be semantically equal to this sketch:

        public ActionResult DoWork()
        {
            if (NotAuthorized())
            {
                return RedirectToAction("403");
            }
            return View();
        }

    except that there should not be any redirect: the URL should stay the same, but the contents of the page should be replaced with the 403 page.
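
    One way to keep the logic out of the controllers, assuming ASP.NET MVC 2 or later where AuthorizeAttribute exposes HandleUnauthorizedRequest: subclass the attribute and, for users who are authenticated but lack the required role, render the 403 view in place instead of issuing the login redirect. A sketch (the view name is a placeholder):

        public class AuthorizeOr403Attribute : AuthorizeAttribute
        {
            protected override void HandleUnauthorizedRequest(AuthorizationContext filterContext)
            {
                if (filterContext.HttpContext.User.Identity.IsAuthenticated)
                {
                    // Authenticated but not authorized: show the 403 page, keep the URL.
                    filterContext.HttpContext.Response.StatusCode = 403;
                    filterContext.Result = new ViewResult { ViewName = "Error403" };
                }
                else
                {
                    // Not authenticated: keep the normal redirect-to-login behaviour.
                    base.HandleUnauthorizedRequest(filterContext);
                }
            }
        }

        // usage: [AuthorizeOr403(Roles = "Admin")] on the action or controller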

  • Redirecting to a new page

    - by Pankaj Khurana
    Hi, I have defined a function in which I want to open a page in a new window. The function has the following code:

        echo "<script>window.location.href='http://localhost/paymentsystem/views/payment.php?id=".$id."'</script>";

    Right now it opens the page in the same window. I want to know how I can open this page in a new tab. Regards, Pankaj
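
    window.location.href always navigates the current window; to get a new tab or window the script has to call window.open instead (whether it appears as a tab or a separate window is decided by the user's browser settings, not by the script). A sketch of the same echo:

        echo "<script>window.open('http://localhost/paymentsystem/views/payment.php?id=" . $id . "', '_blank');</script>";

    Note that popup blockers may intervene when window.open is not triggered directly by a user click.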

  • Internet Explorer 8 timeout too quick on page POSTs

    - by cdm9002
    We have an ASP.NET site running which has been working fine for some time, but recently I have been experiencing some issues with IE8. On posting some pages, mainly on our development server although on staging too, we get an occasional "Internet Explorer cannot display the webpage" error along with the button asking to diagnose connection problems. IE only seems to wait 10 seconds before timing out. I know that the page itself may take longer to load the first time (on dev and staging); press F5 and everything then works fine. Is there anything that should be done in the aspx page to tell IE to wait a bit longer? I thought I had read that the default timeout was supposed to be 90 seconds or so for browsers. A bit more info: it mostly happens on POSTing a signup page, but that is just because I test that page and it starts the IIS app, makes the first connection to SQL and pre-caches some information. That first time, the page can take 10-15 seconds to come back, and IE8 times out after 10 seconds because it has had nothing back. This happens on a dev W7 x64 machine with 8GB RAM, as well as on a staging WIN2008 server. Having googled around a bit, some people are seeing the same problem, but there are no conclusive pointers to the cause or a solution. It isn't a connection problem; everything works fine in Firefox, Chrome and even IE7. I have tried with add-ons disabled and after resetting IE settings, and it still happens. Ideas welcome.

  • Scrollbar within a scrollbar: only make the inner scrollbar jump to an id

    - by Nik
    I have a page that requires a scrollbar: http://www.aus-media.com/dev/site_BYJ/schedule-pricing/pricing.html. You will notice (unless you are running at a high resolution with a big screen) that there is an outer scrollbar for the main page as well as an inner scrollbar for the content. When you click on one of the sub-items, e.g. payments, you will notice that the outer page scrolls down as well as the inner scrollbar. Does anybody know of a way to scroll only the inner scrollbar, so that only it jumps to the id? Thanks guys, Nik
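
    The usual jQuery approach is to cancel the default anchor jump (which is what moves the outer page) and animate the scrollTop of the inner container instead. A sketch in which the selectors are placeholders for the actual anchor links and the scrollable div on the page:

        $('.sub-nav a').click(function (e) {
            e.preventDefault();                        // stop the browser's own jump-to-anchor
            var $pane   = $('#content');               // assumption: the div with the inner scrollbar
            var $target = $($(this).attr('href'));     // e.g. href="#payments"
            $pane.animate({
                scrollTop: $pane.scrollTop() + $target.offset().top - $pane.offset().top
            }, 400);
        });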

  • How to iterate over all the page breaks in an Excel 2003 worksheet via COM

    - by Martin
    I've been trying to retrieve the locations of all the page breaks on a given Excel 2003 worksheet over COM. Here's an example of the kind of thing I'm trying to do:

        Excel::HPageBreaksPtr pHPageBreaks = pSheet->GetHPageBreaks();
        long count = pHPageBreaks->Count;
        for (long i = 0; i < count; ++i)
        {
            Excel::HPageBreakPtr pHPageBreak = pHPageBreaks->GetItem(i + 1);
            Excel::RangePtr pLocation = pHPageBreak->GetLocation();
            printf("Page break at row %d\n", pLocation->Row);
            pLocation.Release();
            pHPageBreak.Release();
        }
        pHPageBreaks.Release();

    I expect this to print out the row numbers of each of the horizontal page breaks in pSheet. The problem I'm having is that although count correctly indicates the number of page breaks in the worksheet, I can only ever seem to retrieve the first one. On the second run through the loop, calling pHPageBreaks->GetItem(i) throws an exception, with error number 0x8002000b, "invalid index". Attempting to use pHPageBreaks->Get_NewEnum() to get an enumerator to iterate over the collection also fails with the same error, immediately on the call to Get_NewEnum(). I've looked around for a solution, and the closest thing I've found so far is http://support.microsoft.com/kb/210663/en-us. I have tried activating various cells beyond the page breaks, including the cells just beyond the range to be printed, as well as the lower-right cell (IV65536), but it didn't help. If somebody can tell me how to get Excel to return the locations of all of the page breaks in a sheet, that would be awesome! Thank you.

    @Joel: Yes, I have tried displaying the user interface and then setting ScreenUpdating to true; it produced the same results. Also, I have since tried combinations of setting pSheet->PrintArea to the entire worksheet and/or calling pSheet->ResetAllPageBreaks() before my call to get the HPageBreaks collection, which didn't help either.

    @Joel: I've used pSheet->UsedRange to determine the row to scroll past, and Excel does scroll past all the horizontal breaks, but I'm still having the same issue when I try to access the second one. Unfortunately, switching to Excel 2007 did not help either.

  • Is it possible to create a generic Util Function to be used in Eval Page

    - by Nassign
    I am currently binding a nullable bit column to a ListView control. When declaring a ListView item, I need to handle the case where the value is null instead of just true or false:

        <asp:Checkbox ID="Chk1" runat="server" Checked='<%# HandleNullableBool(Eval("IsUsed")) %>' />

    Then in the page I add a HandleNullableBool() function inside the ASPX page:

        protected static bool HandleNullableBool(object value)
        {
            return (value == null) ? false : (bool)value;
        }

    This works fine, but I need to use this in several pages, so I tried creating a utility class with a static HandleNullableBool. Using it in the ASPX page, however, does not work. Is there a way to do this in another class instead of the ASPX page?

        <asp:Checkbox ID="Chk1" runat="server" Checked='<%# Util.HandleNullableBool(Eval("IsUsed")) %>' />
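
    This does work as long as the utility class is public, the method is public static, and the page can see its namespace, either through an Import directive on the page (or an entry in web.config's pages/namespaces section) or by fully qualifying the call in the binding expression. A sketch, with the namespace name as a placeholder:

        // App_Code/Util.cs (or a referenced class library)
        using System;

        namespace MyApp.Web.Helpers
        {
            public static class Util
            {
                // Shared helper for nullable bit columns used in data-binding expressions.
                // Values coming from a data source are usually DBNull.Value rather than null.
                public static bool HandleNullableBool(object value)
                {
                    return value != null && value != DBNull.Value && (bool)value;
                }
            }
        }

    and at the top of each ASPX page that uses it:

        <%@ Import Namespace="MyApp.Web.Helpers" %>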

  • IE7 - visited links revert to unvisited after page refresh

    - by Gerald
    Hello, a number of our users have just upgraded from IE6 to IE7. The upgraded users are reporting an issue with visited links reverting to their unvisited color after a page refresh. This only happens to links that use javascript instead of a hard-coded URL:

        <script lang="JavaScript">
        <!--
        function LoadGoogle()
        {
            var LoadGoogle = window.open('http://www.google.com');
        }
        -->
        </script>

        <a href="javascript:LoadGoogle()">Google using javascript</a>
        <a href="#" OnClick="javascript:LoadGoogle()">Google using javascript OnClick</a>

    The above links revert back to the unvisited color whenever the page is refreshed. It doesn't matter whether the page is refreshed because of a postback, by manually hitting the refresh or F5 button, or by an auto-refresh function. Please note, the above code is an over-simplification of what is actually happening, but I believe it illustrates the issue well enough. This is causing a problem for our users because we are providing them with a list of items that are each opened into a new window via javascript when clicked, and we refresh the parent page when the users are finished with them. Each time the parent page is refreshed, all of these links revert back to their unvisited color, so our users are losing track of which items they've worked on. I've been digging around and it looks like this is intended behavior: IE7 doesn't register these links in the browser's history. Does anyone know a workaround that will allow us to keep these javascript links in the visited state without a major overhaul of the app's code? Thank you.
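
    The usual low-impact workaround is to give each anchor its real destination in href, so IE records that URL in its history and the :visited style keeps matching, and to do the window.open from onclick, returning false so the parent page itself does not navigate. A sketch:

        <script type="text/javascript">
        function LoadInNewWindow(link)
        {
            window.open(link.href);   // open the link's real URL in a new window
            return false;             // cancel the default navigation of the parent page
        }
        </script>

        <a href="http://www.google.com" onclick="return LoadInNewWindow(this);">Google</a>

    For application URLs the same pattern applies: put the item's actual URL in href and keep the popup logic in the handler.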

  • LaTeX: remove blank page after a \part or \chapter

    - by CaptSolo
    How do I remove the blank page that gets added automatically after \part{} or \chapter{} in the book document class? I need to add some short text describing the \part. Adding some text after the \part command results in at least 3 pages, with an empty page between the part heading and the text:

        Part xx
        (empty)
        some text

    How do I get rid of that empty page? P.S. "Latex: How to remove blank pages coming between two chapters in Appendix?" is similar, but it changes the behavior for the rest of the text, while I need to remove the empty page for this one \part command only.
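
    The blank page comes from the openright behaviour of the book class, which forces whatever follows the part page onto a right-hand page. Changing it globally is just \documentclass[openany]{book}; to change it for this one \part only, the switch can be flipped locally. A sketch that relies on the class's internal \if@openright, so it is worth testing against your class options:

        \makeatletter
        \begingroup
          \@openrightfalse          % for this group only: do not force a right-hand page
          \part{Some Part Title}
          Short text describing the part.
        \endgroup
        \makeatother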

  • Get page permalink and title outside the loop in wordpress

    - by Aakash Chakravarthy
    Hello, how do I get the page permalink and title outside the loop in WordPress? I have a function like

        function get_post_info() {
            global $post;
            $permalink = get_permalink($post->ID);
            $title     = get_the_title($post->ID);
            return array('url' => $permalink, 'title' => $title);
        }

    When this function is called within the loop, it returns the post's title and URL. When it is called outside the loop, it does not return the current page's title and URL; instead it returns the latest post's title and URL. When called on the home page it should return the home page's title and URL. How can I get it to behave like this?
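
    A sketch of one way to make this work outside the loop: ask WordPress for the queried object instead of relying on the global $post, which outside the loop tends to point at a post from the latest-posts query. get_queried_object_id() needs WP 3.1+, and the front-page fallback assumes a static front page may be configured:

        function get_post_info() {
            $id = get_queried_object_id();                 // ID of the page/post being viewed

            if ( ! $id && is_front_page() ) {
                $id = (int) get_option( 'page_on_front' ); // static front page, if one is set
            }

            if ( $id ) {
                return array( 'url' => get_permalink( $id ), 'title' => get_the_title( $id ) );
            }

            // Blog index or no single object: fall back to the site itself.
            return array( 'url' => home_url( '/' ), 'title' => get_bloginfo( 'name' ) );
        }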

  • Dynamically create controls using stringbuilder

    - by Shrewdy
    Hi, I have been trying to create controls dynamically on my web page using the StringBuilder class, and I don't quite seem to get through; any help would be appreciated. I am trying to do this:

        StringBuilder sbTest = new StringBuilder(string.Empty);
        sbTest.Append("<input type=\"text\" id=\"txt1\" runat=\"server\" />");
        Response.Write(sbTest.ToString());

    The page does display a textbox in the browser, which is easily accessible through JavaScript, but what I want is for the control to be available on the server side too, so that when the page is posted back to the server I can easily obtain the value the user entered into the textbox. Can anyone please help me with this? Thank you so much.
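
    Markup written out with Response.Write never becomes a server control; runat="server" only matters at page parse time. A sketch of the usual alternative: create the control as an object in Page_Init and add it to a placeholder, so it takes part in view state and postback (the PlaceHolder and button names are placeholders):

        // In the .aspx markup: <asp:PlaceHolder ID="phDynamic" runat="server" />

        protected void Page_Init(object sender, EventArgs e)
        {
            // Recreate the control on every request, with the same ID,
            // so ASP.NET can bind its posted value back to it.
            TextBox txt1 = new TextBox { ID = "txt1" };
            phDynamic.Controls.Add(txt1);
        }

        protected void btnSave_Click(object sender, EventArgs e)
        {
            TextBox txt1 = (TextBox)phDynamic.FindControl("txt1");
            string enteredValue = txt1.Text;   // the value typed by the user, available server-side
        }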
