Search Results

Search found 9935 results on 398 pages for 'pages'.

Page 52 of 398

  • % style macros not supported in some C++/CLI project property pages under VS2010?

    - by Dave Foster
    We're currently evaluating VS2010 and have upgraded our VS2008 C++/CLI project to the new .vcxproj format. I've noticed that a certain property we had set in the project settings did not get translated properly. Under Configuration Properties - Managed Resources - Resource Logical Name, we used to have (in VS2008) the setting:

        $(IntDir)\$(RootNamespace).$(InputName).resources

    which indicated that all .resx files were to compile into OurLib.SomeForm.resources inside the assembly (the Debug portion is dropped when assembled). According to MSDN, the $(InputName) macro no longer exists and should be replaced with %(Filename). However, when I translate the above line to swap those macros, %(Filename) never seems to expand. On the second .resx file it tries to compile, I get "LINK : fatal error LNK1316: duplicate managed resource name 'Debug\OurLib.%(Filename).resources'". This indicates to me that the % style macros are not being expanded here, at least in this specific property.

    If we don't set anything in that property, the default behavior seems to be to add the subdirectory as a prefix, such as OurLib.Forms.SomeForm.resources, where Forms is the subdirectory of our project that the .resx file lives in. This only occurs when the .resx file is in an immediate subdirectory of the project being built. If a .resx file exists somewhere else on disk (e.g. ..\OtherLib\Forms\SomeForm2.resx) this prefix is NOT added. This is causing an issue with loading form resources, as nothing accounts for this possible prefix, even though we are using the standard Forms Designer method of getting at resources:

        System::ComponentModel::ComponentResourceManager^ resources = (gcnew System::ComponentModel::ComponentResourceManager(SomeForm::typeid));

    and do not specify the .resources file by name. The issue I've just described may not be the same as the original question, but if I were to fix the Resource Logical Name issue I think this would all go away. Does anyone have any information about these % macros and where they are allowed to be used?
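
    In MSBuild terms, %(Filename) is item metadata, and item metadata only expands where an item is in scope (per-item metadata, an ItemDefinitionGroup, or a batched target), not in a plain property value. A minimal sketch of the kind of .vcxproj fragment involved is below; it assumes the .resx files appear as EmbeddedResource items and that the Resource Logical Name property maps to LogicalName metadata, both of which should be verified against the project file VS2010 actually generates:

        <!-- Hedged sketch of a .vcxproj fragment (item type and metadata name assumed) -->
        <ItemGroup>
          <EmbeddedResource Include="Forms\SomeForm.resx">
            <!-- %(Filename) expands here because it is attached to the item itself -->
            <LogicalName>$(RootNamespace).%(Filename).resources</LogicalName>
          </EmbeddedResource>
        </ItemGroup>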

    Read the article

  • Iterating anchors in jquery doesn't seem to work...

    - by bala3569
    Hi, I am generating page numbers based on currentPage and lastPage using jQuery. Here is my function; as I am a newbie, I don't know how it can be done:

        function generatePages(currentPage, LastPage) {
            if (LastPage <= 5) {
                var pages = '';
                for (var i = 1; i <= 5; i++) {
                    pages += "<a class='page-numbers' href='#'>" + i + "</a>";
                }
                $("#PagerDiv").append(pages);
            }
            if (LastPage > 5) {
                var pages = '';
                for (var i = 1; i <= 5; i++) {
                    pages += "<a class='page-numbers' href='#'>" + i + "</a>";
                }
                $("#PagerDiv").append(pages);
            }
        }

    I want the result to differ when it is the first page, when it is somewhere in the middle, and when it is the last page. I have the lastPage and currentPage values; please help me out with this.
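
    A minimal sketch of one way to do this (not the poster's final code): build a window of up to five links clamped to the valid page range and centred on the current page. The #PagerDiv container and the page-numbers class come from the question; the current class is an assumption for styling the active page.

        function generatePages(currentPage, lastPage) {
            var windowSize = 5;
            // Clamp the window so it never runs past page 1 or past lastPage.
            var start = Math.max(1, Math.min(currentPage - 2, lastPage - windowSize + 1));
            var end = Math.min(lastPage, start + windowSize - 1);
            var pages = '';
            for (var i = start; i <= end; i++) {
                var cls = (i === currentPage) ? 'page-numbers current' : 'page-numbers';
                pages += "<a class='" + cls + "' href='#'>" + i + "</a>";
            }
            $("#PagerDiv").empty().append(pages);
        }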

    Read the article

  • How can I set QNetworkReply properties to get correct NCBI pages?

    - by Claire Huang
    I am trying to fetch the following URL using the downloadURL function below: http://www.ncbi.nlm.nih.gov/nuccore/27884304. But the data returned is not what we can see through a browser. I now know this is because I need to supply the right information, such as the browser identification. How can I find out what kind of information I need to set, and how can I set it (via the setHeader function?)? In VC++ we can use the CInternetSession and CHttpConnection objects to get the correct content without setting any other detailed information; is there any similar way in Qt or another cross-platform C++ network library? (Yes, I need the cross-platform property.)

        QNetworkReply::NetworkError downloadURL(const QUrl &url, QByteArray &data)
        {
            QNetworkAccessManager manager;
            QNetworkRequest request(url);
            request.setHeader(QNetworkRequest::ContentTypeHeader,
                              "Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7 (.NET CLR 3.5.30729)");
            QNetworkReply *reply = manager.get(request);

            QEventLoop loop;
            QObject::connect(reply, SIGNAL(finished()), &loop, SLOT(quit()));
            loop.exec();

            int direction;
            QVariant statusCodeV = reply->attribute(QNetworkRequest::RedirectionTargetAttribute);
            QUrl redirectTo = statusCodeV.toUrl();
            if (!redirectTo.isEmpty()) {
                if (redirectTo.host().isEmpty()) {
                    const QByteArray newaddr = ("http://" + url.host() + redirectTo.encodedPath()).toAscii();
                    redirectTo.setEncodedUrl(newaddr);
                    redirectTo.setHost(url.host());
                }
                return (downloadURL(redirectTo, data));
            }
            if (reply->error() != QNetworkReply::NoError) {
                return reply->error();
            }
            data = reply->readAll();
            delete reply;
            return QNetworkReply::NoError;
        }
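
    One detail worth checking in the snippet above, offered as a hedged sketch rather than a guaranteed fix for the NCBI page: QNetworkRequest::ContentTypeHeader sets the Content-Type of a request body, so the Firefox string above is not actually sent as browser identification. A User-Agent value normally goes out as a raw header:

        #include <QNetworkAccessManager>
        #include <QNetworkReply>
        #include <QNetworkRequest>
        #include <QUrl>

        // Sketch: issue the GET with an explicit User-Agent header.
        QNetworkReply *getWithUserAgent(QNetworkAccessManager &manager, const QUrl &url)
        {
            QNetworkRequest request(url);
            // Browser identification belongs in User-Agent, not Content-Type.
            request.setRawHeader("User-Agent",
                                 "Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US)");
            return manager.get(request);
        }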

    Read the article

  • Development deployment: how to achieve edit-and-reload with JSP pages?

    - by doublep
    Our project uses WebLogic as the web server and uses mostly JSP for the user interface. With the standard setup it is possible to copy edited JSP files into the exploded deployment directory, and WebLogic will automatically pick them up, recompile them and serve the new content over HTTP. However, is it possible to avoid the copying altogether, so that I just save a file in my editor and it is immediately (well, after a couple of seconds for recompilation) visible? The project uses Apache Ant as its build tool. I would imagine what I want would be possible with symlinks (since this is for development deployment only, I don't care about cross-platform support), but then I don't see how to symlink lots of files at once with Ant. So, how do I achieve save-JSP-hit-F5-in-browser functionality: either with some setting in WebLogic, or by symlinking JSPs using Apache Ant (instead of copying them as is done now), or something else completely?
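
    One hedged possibility, sketched below rather than tested against WebLogic: instead of symlinking every JSP individually, Ant's <symlink> task (Unix-only) can link a whole directory once, so the exploded deployment's JSP directory points back at the source tree. The property names and paths are illustrative:

        <!-- Sketch: link the exploded deployment's JSP directory to the source tree -->
        <target name="link-jsps">
            <symlink link="${exploded.dir}/jsp"
                     resource="${basedir}/web/jsp"
                     overwrite="true"
                     failonerror="true"/>
        </target>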

    Read the article

  • Dynamically loading CSS and JavaScript using Prototype

    - by Salman A
    I have a classic ASP application that I've been constantly trying to modularize. Currently, almost all pages are divided into two parts: an outer page that contains the layout (header, sidebar, footer), and an inner page that contains the ASP code. The outer pages use Dreamweaver templates, so updating the layout and replicating changes is easy. The inner pages are managed by me. Now here is the problem: I had to add a lightbox to one page, and I chose Lightbox 2, which requires Prototype. I ended up adding Prototype on every page, assuming that sooner or later I'll upgrade all pages, forms, AJAX requests and other JavaScript to use Prototype. I've now added two other plugins -- Modalbox and Protofade -- each with a pair of .JS and .CSS files. Since I'll be using these three plugins on a specific set of pages, I am wondering if I can load the required CSS and JS files dynamically. I do not want to access the document head and add the include files there; I'll have to do this from inside the DIV where all the ASP code is supposed to go.
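
    A hedged sketch of run-time loading that works from script placed inside the inner DIV (it only touches the head at run time via the DOM, not in the template source). The plugin file paths are assumptions:

        function loadScript(url) {
            var s = document.createElement('script');
            s.type = 'text/javascript';
            s.src = url;
            document.getElementsByTagName('head')[0].appendChild(s);
        }

        function loadStylesheet(url) {
            var l = document.createElement('link');
            l.rel = 'stylesheet';
            l.type = 'text/css';
            l.href = url;
            document.getElementsByTagName('head')[0].appendChild(l);
        }

        // Only the pages that need a given plugin call these (paths hypothetical):
        loadStylesheet('/css/modalbox.css');
        loadScript('/js/modalbox.js');

    One caveat worth noting: scripts injected this way load asynchronously, so any code that depends on the plugin should run from its onload handler or after the page has finished loading.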

    Read the article

  • How can I track an ASP.NET page's size without tracing?

    - by Middletone
    I want to be able to track the amount of data that is being transferred from my web site to each user that accesses the site. I can do this for file downloads and such, but what about the pure HTML content itself? How can I track the output size of a page (or of the data that's transferred via an AJAX call) to the client and log it against a particular user's session? Also, how would this differ when GZip is used in IIS 6.0?
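
    A minimal sketch of one approach, assuming classic ASP.NET on IIS 6: wrap Response.Filter in a stream that counts every byte written for the request, then log the total in EndRequest. All names are illustrative, and whether the count reflects GZip depends on where the compression happens (IIS 6's own compression runs outside the ASP.NET pipeline, so this filter would see the uncompressed size).

        using System;
        using System.IO;
        using System.Web;

        // Counts the bytes that flow through Response.Filter for one request.
        public class ByteCountingFilter : Stream
        {
            private readonly Stream _inner;
            public long BytesWritten { get; private set; }

            public ByteCountingFilter(Stream inner) { _inner = inner; }

            public override void Write(byte[] buffer, int offset, int count)
            {
                BytesWritten += count;                  // tally the outgoing bytes
                _inner.Write(buffer, offset, count);
            }

            // Minimal pass-through plumbing required by Stream.
            public override bool CanRead  { get { return false; } }
            public override bool CanSeek  { get { return false; } }
            public override bool CanWrite { get { return true; } }
            public override long Length   { get { throw new NotSupportedException(); } }
            public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
            public override void Flush()  { _inner.Flush(); }
            public override int Read(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
            public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
            public override void SetLength(long value) { throw new NotSupportedException(); }
        }

        public class ResponseSizeModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                app.BeginRequest += delegate(object sender, EventArgs e)
                {
                    HttpContext ctx = ((HttpApplication)sender).Context;
                    ByteCountingFilter filter = new ByteCountingFilter(ctx.Response.Filter);
                    ctx.Response.Filter = filter;
                    ctx.Items["byteCounter"] = filter;
                };
                app.EndRequest += delegate(object sender, EventArgs e)
                {
                    HttpContext ctx = ((HttpApplication)sender).Context;
                    ByteCountingFilter filter = ctx.Items["byteCounter"] as ByteCountingFilter;
                    if (filter != null)
                    {
                        // Log filter.BytesWritten against the user here, e.g. keyed by an
                        // authentication cookie (Session is not reliable in EndRequest).
                    }
                };
            }

            public void Dispose() { }
        }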

    Read the article

  • RDLC item width is dynamic and causing extra pages to be generated (image included)?

    - by Paul Mendoza
    I'm trying to format an RDLC report file in Visual Studio 2008 and I am having a formatting issue. I have a list at the bottom that contains a matrix which expands horizontally to the right. The pink box (from the image referenced in the title) is just there to visualize the problem I'm having. When the report is rendered, the matrix expands, and instead of filling the pink box, it pushes the space in the pink box to the right, resulting in an extra page when printing the reports. One solution would be to shrink the pink box to the size of the matrix, which I've done. But then, when the matrix grows, the fields at the top of the report get pushed to the right by the same amount as the growth of the matrix. Can someone please let me know what they think the solution would be? Thank you!

    Read the article

  • How do I load ascx pages faster in Visual Studio 2008?

    - by diadem
    One of my (my team's) biggest peeves with VS2008 is the slow speed in which ascx load. It could take up to a couple minutes to do something as simple as a text or style change simply because of the time it takes to load an ascx page into the visual studio text editor. Half the time I'm tempted to check out the file, edit it in notepad, then check it back in. Is there any trick to speeding this up?

    Read the article

  • Should I remove Etag for htm and php pages?

    - by Castor
    I generate .htm files dynamically using PHP and .htaccess. I read somewhere that I should remove ETags for files of type text/html. Is that correct? I am also wondering: if I use ETags and I don't change the content, could I save some bandwidth? I would appreciate it if you could tell me whether I can use ETags for .htm files.
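
    For reference, both directions can be expressed in the .htaccess file the question mentions; this is a hedged sketch, with the usual caveat that the Header directive requires mod_headers:

        # Option 1: remove ETags entirely (clients then fall back to Last-Modified).
        FileETag None
        <IfModule mod_headers.c>
            Header unset ETag
        </IfModule>

        # Option 2 (alternative): keep ETags but base them only on mtime and size,
        # so they stay stable when the same files are served from several servers.
        # FileETag MTime Size

    Either way, the bandwidth saving comes from conditional requests: when the validator (ETag or Last-Modified) still matches, Apache answers 304 Not Modified instead of resending the page body. Note that responses produced by PHP itself do not get Apache's file-based ETags, so if the .htm URLs are actually executed by PHP rather than served as static files, the validation headers would have to be sent from the script.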

    Read the article

  • What is a better way to convert a simple sinatra app to static html pages?

    - by dimus
    A friend of mine asked me to create a static website, and I found that making such a site using Sinatra is a pure joy. I just wrote all my routes like this:

        get '/index.html' do
          haml :index
        end

        get '/app.css' do
          sass :app
        end
        ....

    So I was able to use layouts, Haml and Sass to put the site together quickly. To create the static site I used:

        wget -r -l2 http://localhost:4567

    which did work pretty well, but I imagine there is a better way to create a static site from the Sinatra code?
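
    One hedged alternative sketch: render the routes through Rack's mock request machinery instead of crawling a running server. The file names, route list and output directory below are assumptions, the app is assumed to be a classic-style Sinatra app (so Sinatra::Application is the Rack app), and require_relative assumes Ruby 1.9+:

        require 'fileutils'
        require 'rack/mock'
        require_relative 'app'   # the Sinatra file containing the routes (name assumed)

        PAGES  = %w[/index.html /app.css]   # routes to snapshot
        OUTPUT = 'build'

        app = Rack::MockRequest.new(Sinatra::Application)

        PAGES.each do |path|
          response = app.get(path)
          target = File.join(OUTPUT, path)
          FileUtils.mkdir_p(File.dirname(target))
          File.open(target, 'w') { |f| f.write(response.body) }   # write the rendered page
          puts "#{path} -> #{target} (#{response.status})"
        end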

    Read the article

  • Best place to check user authenticity in a back end module where all pages are only available to members

    - by understack
    I have a backend module which can be accessed only by authorized members, so I need to check authenticity for all actions and all controllers. Currently I'm doing it inside the preDispatch() functions of the controller classes, which takes care of all the actions inside each controller, but I still have to do it for every controller. Is there a place where I could check it for all the controllers as well? Basically I want a single-place authenticity check for the whole backend module. Can I do it in the bootstrap?
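
    The preDispatch() hook and the bootstrap wording suggest Zend Framework 1; assuming that, a front controller plugin is one conventional single place for the check. This is a hedged sketch with assumed module, controller and class names:

        <?php
        // Runs before every dispatch cycle, for every controller.
        class My_Plugin_BackendAuth extends Zend_Controller_Plugin_Abstract
        {
            public function preDispatch(Zend_Controller_Request_Abstract $request)
            {
                // Only guard the backend module (module name assumed).
                if ($request->getModuleName() !== 'backend') {
                    return;
                }
                if (!Zend_Auth::getInstance()->hasIdentity()) {
                    // Re-route unauthenticated users to a login action (names assumed).
                    $request->setModuleName('default');
                    $request->setControllerName('auth');
                    $request->setActionName('login');
                }
            }
        }

        // Registered once, e.g. from the bootstrap:
        // Zend_Controller_Front::getInstance()->registerPlugin(new My_Plugin_BackendAuth());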

    Read the article

  • Data between pages: $_SESSION vs. $_GET?

    - by Haroldo
    OK, firstly, this is not about forms; this is about consistent layout as a user explores a site. Let me explain: imagine a (non-AJAX) digital camera online store, and say someone was in the DSLR section and chose to view the cameras in gallery mode, ordered by price. They then click through to the compact cameras page. It would be in the user's interest if the 'views' they selected were carried over to this new page. Now, I'd say use a session - am I wrong? Are there performance issues I should be aware of for a few small session vars (e.g. view=1, orderby=price)?
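
    A hedged sketch of the session approach (keys, defaults and validation are illustrative): store the preferences in $_SESSION, and let a query-string parameter override and update them when the user changes a view.

        <?php
        session_start();

        // Allowed values double as a whitelist for the GET overrides.
        $allowed = array(
            'view'    => array('list', 'gallery'),
            'orderby' => array('name', 'price'),
        );

        foreach ($allowed as $key => $values) {
            if (isset($_GET[$key]) && in_array($_GET[$key], $values, true)) {
                $_SESSION['prefs'][$key] = $_GET[$key];      // user just changed it
            } elseif (!isset($_SESSION['prefs'][$key])) {
                $_SESSION['prefs'][$key] = $values[0];       // first visit: default
            }
        }

        $view    = $_SESSION['prefs']['view'];
        $orderby = $_SESSION['prefs']['orderby'];
        // ...use $view and $orderby when rendering the camera listing...

    A couple of small session variables like these are not a performance concern in themselves; the usual trade-offs are that session-dependent pages are harder to cache and that the URL no longer fully describes the view, which matters for bookmarking and shared links.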

    Read the article

  • Greasemonkey is getting an empty document.body on select Google pages.

    - by Brock Adams
    Hi, I have a Greasemonkey script that processes Google search results, but it's failing in a few instances, when XPath searches (and the document body) appear to be empty. Running the code in Firebug's console works every time; it only fails in a Greasemonkey script. Greasemonkey sees an empty document.body. I've boiled the problem down to a test Greasemonkey script, below. I'm using Firefox 3.5.9 and Greasemonkey 0.8.20100408.6 (but earlier versions had the same problem).

    Problem: Greasemonkey sees an empty document.body.

    Recipe to duplicate:

    1. Install the Greasemonkey script.
    2. Open a new tab or window.
    3. Navigate to Google.com (http://www.google.com/).
    4. Search on a simple term like "cats".
    5. Check Firefox's Error console (Ctrl-Shift-J) or Firebug's console. The script will report that the document body is empty.
    6. Hit refresh. The script will show a good result (document body found). Note that the failure only reliably appears on Google results obtained this way, and on a new tab/window.
    7. Turn JavaScript off globally (javascript.enabled set to false in about:config).
    8. Repeat steps 2 thru 5. Only now the Greasemonkey script will work.

    It seems that Google's JavaScript is killing the DOM tree for Greasemonkey, somehow. I've tried a time-delayed retest and even a programmatic refresh; the script still fails to see the document body.

    Test script:

        // ==UserScript==
        // @name          TROUBLESHOOTING 2 snippets
        // @namespace     http://www.google.com/
        // @description   For code that has funky misfires and defies standard debugging.
        // @include       http://*/*
        // ==/UserScript==

        function LocalMain (sTitle) {
            var sUserMessage = '';
            //var sRawHtml = unsafeWindow.document.body.innerHTML; //-- unsafeWindow makes no difference.
            var sRawHtml = document.body.innerHTML;

            if (sRawHtml) {
                sRawHtml = sRawHtml.replace (/^\s\s*/, '').substr (0, 60);
                sUserMessage = sTitle + ', Doc body = ' + sRawHtml + ' ...';
            }
            else {
                sUserMessage = sTitle + ', Document body seems empty!';
            }

            if (typeof (console) != "undefined") {
                console.log (sUserMessage);
            }
            else {
                if (typeof (GM_log) != "undefined")
                    GM_log (sUserMessage);
                else if (!sRawHtml)
                    alert (sUserMessage);
            }
        }

        LocalMain ('Preload');
        window.addEventListener ("load", function() {LocalMain ('After load');}, false);

    Read the article

  • How can I request local pages in the background of an ASP.NET MVC app?

    - by flipdoubt
    My ASP.NET MVC app needs to run a set of tasks at startup, and then in the background at a regular interval. I have implemented each task as a controller action and listed the app-relative path to each action in the database. I implemented a TaskRunner process that gets the URLs from the database and requests each one at a regular interval using WebRequest.Create, but this throws a UriFormatException. I cannot use this answer or any code that plucks values from HttpContext.Current.Request without getting an HttpException with the message "Request is not available in this context". The Request object is not available because my code uses System.Threading.Timer to do the background processing, as recommended here. Here are my questions:

    1. Is there really no way to make local web requests within an ASP.NET web app?
    2. Is there really no way to dynamically ascertain the root path of the web app, even using static dependencies in ASP.NET?
    3. I was trying to avoid storing the app's root path in the database (as FogBugz does with its "Maintenance Path"), but is this the best option?
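
    A hedged sketch of one way around the UriFormatException: capture the site's absolute root once, on the first real request (where Request is available), and have the timer thread combine it with the stored app-relative paths. Class, field and helper names here are assumptions, not the poster's code:

        using System;
        using System.Net;
        using System.Threading;
        using System.Web;

        public static class TaskRunner
        {
            private static Uri _baseUri;      // e.g. http://host/virtualdir/
            private static Timer _timer;

            // Call from Application_BeginRequest, where an HttpRequest exists.
            public static void CaptureBaseUri(HttpRequest request)
            {
                if (_baseUri != null) return;
                string root = request.Url.GetLeftPart(UriPartial.Authority)
                              + request.ApplicationPath.TrimEnd('/') + "/";
                _baseUri = new Uri(root);
            }

            // Call once from Application_Start.
            public static void Start()
            {
                _timer = new Timer(Run, null, TimeSpan.FromMinutes(1), TimeSpan.FromMinutes(1));
            }

            private static void Run(object state)
            {
                if (_baseUri == null) return;     // no request seen yet, nothing to do
                foreach (string relativePath in LoadTaskPathsFromDatabase())
                {
                    // An absolute Uri avoids the UriFormatException that an
                    // app-relative path ("~/tasks/foo") causes in WebRequest.Create.
                    Uri target = new Uri(_baseUri, relativePath.TrimStart('~', '/'));
                    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(target);
                    using (request.GetResponse()) { }
                }
            }

            private static string[] LoadTaskPathsFromDatabase()
            {
                return new string[0];             // placeholder for the real lookup
            }
        }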

    Read the article

  • In Spring MVC (2.0) how can you easily hook multiple pages/urls to use 1 controller?

    - by danny
        <!-- dispatcher file -->
        <bean id="urlMapping" class="org.springframework.web.servlet.handler.SimpleUrlHandlerMapping">
            <property name="mappings">
                <props>
                    <prop key="/foo/bar/baz/boz_a.html">bozController</prop>
                </props>
            </property>
        </bean>

        <!-- mappings file -->
        <bean id="bozController" class="com.mycompany.foo.bar.baz.BozController">
            <property name="viewPathA" value="foo/bar/baz/boz_a" />
            <property name="viewPathB" value="foo/bar/baz/boz_b" />
            ...
            <property name="viewPathZ" value="foo/bar/baz/boz_z" />
        </bean>

    How do I set it up so that when the user loads the page boz_w.html it uses the bozController and sets the viewPath to use boz_w.jsp?
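
    A hedged sketch of one way to do this in Spring 2.0: SimpleUrlHandlerMapping matches Ant-style patterns, so a single wildcard entry can send every boz_*.html URL to the controller. The exact pattern is an assumption about this app's URL layout:

        <!-- dispatcher file: one wildcard mapping instead of one entry per page -->
        <bean id="urlMapping" class="org.springframework.web.servlet.handler.SimpleUrlHandlerMapping">
            <property name="mappings">
                <props>
                    <prop key="/foo/bar/baz/boz_*.html">bozController</prop>
                </props>
            </property>
        </bean>

    The controller can then derive the view name from the request path instead of needing one viewPath property per page (this sketch assumes the DispatcherServlet is mapped so that getServletPath() returns the full page path):

        package com.mycompany.foo.bar.baz;

        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import org.springframework.web.servlet.ModelAndView;
        import org.springframework.web.servlet.mvc.AbstractController;

        public class BozController extends AbstractController {
            protected ModelAndView handleRequestInternal(HttpServletRequest request,
                                                         HttpServletResponse response) throws Exception {
                // "/foo/bar/baz/boz_w.html" -> "foo/bar/baz/boz_w", which the usual
                // InternalResourceViewResolver prefix/suffix resolves to boz_w.jsp.
                String path = request.getServletPath();
                String viewName = path.substring(1, path.length() - ".html".length());
                return new ModelAndView(viewName);
            }
        }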

    Read the article

  • Rendering "partial" pages in ASP.Net? (without the <html> and such)

    - by Earlz
    Hello, I am trying to make use of jQuery UI AJAX tabs in my ASP.NET WebForms project, and I have come up against a wall. For AJAX, you must render only a partial page (no <html> and such elements) from an external URL. How would you best do this in ASP.NET? .aspx files require things like <html> and <head> tags, so those wouldn't work, and the only thing that comes to mind is using cumbersome .ashx files. Am I just overthinking this? Is there an easier way?
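
    For what it's worth, here is a hedged sketch of the .ashx route the question mentions: a generic handler only has to write the fragment the tab will inject, with no page skeleton around it. The file name and markup are illustrative:

        <%@ WebHandler Language="C#" Class="TabContent" %>

        using System.Web;

        // TabContent.ashx: returns only the HTML fragment for one jQuery UI tab.
        public class TabContent : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "text/html";
                // No <html>/<head>/<body>: just the partial markup.
                context.Response.Write("<div class='tab-panel'><p>Partial content for this tab.</p></div>");
            }

            public bool IsReusable
            {
                get { return false; }
            }
        }

    An .aspx page stripped down to just the fragment markup can serve the same purpose, so the handler is not the only option; this sketch simply shows the minimal moving parts.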

    Read the article
