Search Results

Search found 53297 results on 2132 pages for 'web design hero'.


  • Calling web service using javascript error

    - by politrok
    Hi, I am trying to call a web service using JSON. When I call the web service without the expected parameter it works, but when I try to send the parameter I get an error. This is my code:

        function GetSynchronousJSONResponse(url, postData) {
            var xmlhttp = null;
            if (window.XMLHttpRequest)
                xmlhttp = new XMLHttpRequest();
            else if (window.ActiveXObject) {
                if (new ActiveXObject("Microsoft.XMLHTTP"))
                    xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
                else
                    xmlhttp = new ActiveXObject("Msxml2.XMLHTTP");
            }
            url = url + "?rnd=" + Math.random(); // to ensure a non-cached version
            xmlhttp.open("POST", url, false);
            xmlhttp.setRequestHeader("Content-Type", "application/json; charset=utf-8");
            xmlhttp.send(postData);
            var responseText = xmlhttp.responseText;
            return responseText;
        }

        function Test() {
            var result = GetSynchronousJSONResponse(
                'http://localhost:1517/Mysite/Myservice.asmx/myProc',
                '{"MyParam":"' + 'test' + '"}');
            result = eval('(' + result + ')');
            alert(result.d);
        }

    This is the error:

        System.InvalidOperationException: Request format is invalid: application/json; charset=utf-8.
           at System.Web.Services.Protocols.HttpServerProtocol.ReadParameters()
           at System.Web.Services.Protocols.WebServiceHandler.CoreProcessRequest

    What is wrong? Thanks in advance.
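
    A common cause of "Request format is invalid" on an .asmx endpoint is that the service class is not marked as a script service, so ASP.NET only accepts SOAP or form-encoded posts and rejects the JSON body. Below is a minimal sketch of the server side; the service name, method name and parameter come from the URL and JSON in the question, everything else (namespace, return value) is an assumption, and it presumes the ASP.NET AJAX script handlers are registered in web.config:

        using System.Web.Services;
        using System.Web.Script.Services;

        [WebService(Namespace = "http://tempuri.org/")]
        [ScriptService]                        // required so the .asmx accepts application/json posts
        public class Myservice : WebService
        {
            [WebMethod]
            [ScriptMethod(UseHttpGet = false)]    // matches the POSTed body {"MyParam":"test"}
            public string myProc(string MyParam)  // parameter name must match the JSON property
            {
                return "received: " + MyParam;
            }
        }

    With the [ScriptService] attribute in place the JSON request above is deserialized into the MyParam argument, and the response comes back wrapped in the "d" property that the Test() function already reads.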

    Read the article

  • Wysiwyg with image copy/paste

    - by jW
    First, I understand that an image cannot simply be "copied" from a local machine into a website; it must be uploaded. I am a web programmer and am familiar with common web wysiwyg tools such as TinyMCE and FCKeditor. My question is whether there exists a program, web module, or something of the sort that will perform an automatic upload of images for a wysiwyg. I have a client who is constantly complaining about not being able to copy/paste documents with images from MS Word into a wysiwyg to create content on their website. I have looked into TX Text Control (http://labs.textcontrol.com/) and was also considering a Flash-based wysiwyg that could upload the files automatically behind the scenes. I don't know if this exists, and Google did not help me much in my search, so I thought I would ask other coders. I am open to any sort of server technology or browser requirements, but I am looking for a browser-based tool rather than an application such as Dreamweaver. If no good solution to the problem exists, I am willing to accept that at this point. Note: this was a request from a client, and to me it seemed rather unreasonable. I decided to gather community advice instead of just telling the client 'No', and the options here have been extremely helpful and informative in presenting possible solutions.

    Read the article

  • Python web scraping involving HTML tags with attributes

    - by rohanbk
    I'm trying to make a web scraper that will parse a web page of publications and extract the authors. The skeletal structure of the page is the following:

        <html>
          <body>
            <div id="container">
              <div id="contents">
                <table>
                  <tbody>
                    <tr>
                      <td class="author">#### I want whatever is located here ####</td>
                    </tr>
                  </tbody>
                </table>
              </div>
            </div>
          </body>
        </html>

    I've been trying to use BeautifulSoup and lxml to accomplish this, but I'm not sure how to handle the two div tags and the td tag because they have attributes. In addition, I'm not sure whether I should rely more on BeautifulSoup, on lxml, or on a combination of both. What should I do? At the moment, my code looks like what is below:

        import re
        import urllib2, sys
        import lxml
        from lxml import etree
        from lxml.html.soupparser import fromstring
        from lxml.etree import tostring
        from lxml.cssselect import CSSSelector
        from BeautifulSoup import BeautifulSoup, NavigableString

        address = 'http://www.example.com/'
        html = urllib2.urlopen(address).read()
        soup = BeautifulSoup(html)
        html = soup.prettify()
        html = html.replace('&nbsp', '&#160')
        html = html.replace('&iacute', '&#237')
        root = fromstring(html)

    I realize that a lot of the import statements may be redundant, but I just copied whatever I currently had in my source file. EDIT: I suppose I didn't make this quite clear, but there are multiple such tags on the page that I want to scrape.

    Read the article

  • Asp.net Crawler Webresponse Operation Timed out.

    - by Leon
    Hi, I have built a simple threadpool-based web crawler within my web application. Its job is to crawl its own application space and build a Lucene index of every valid web page and its meta content. Here's the problem: when I run the crawler from a debug server instance of Visual Studio Express and give it the IIS URL as the starting point, it works fine. However, when I do not provide the IIS instance and it takes its own URL to start the crawl process (i.e. crawling its own domain space), I get an operation-timed-out exception on the WebResponse statement. Could someone please guide me on what I should or should not be doing here? Here is my code for fetching the page; it is executed in a multithreaded environment.

        private static string GetWebText(string url)
        {
            string htmlText = "";
            HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
            request.UserAgent = "My Crawler";
            using (WebResponse response = request.GetResponse())
            {
                using (Stream stream = response.GetResponseStream())
                {
                    using (StreamReader reader = new StreamReader(stream))
                    {
                        htmlText = reader.ReadToEnd();
                    }
                }
            }
            return htmlText;
        }

    And the following is my stack trace:

        at System.Net.HttpWebRequest.GetResponse()
        at CSharpCrawler.Crawler.GetWebText(String url) in c:\myAppDev\myApp\site\App_Code\CrawlerLibs\Crawler.cs:line 366
        at CSharpCrawler.Crawler.CrawlPage(String url, List`1 threadCityList) in c:\myAppDev\myApp\site\App_Code\CrawlerLibs\Crawler.cs:line 105
        at CSharpCrawler.Crawler.CrawlSiteBuildIndex(String hostUrl, String urlToBeginSearchFrom, List`1 threadCityList) in c:\myAppDev\myApp\site\App_Code\CrawlerLibs\Crawler.cs:line 89
        at crawler_Default.threadedCrawlSiteBuildIndex(Object threadedCrawlerObj) in c:\myAppDev\myApp\site\crawler\Default.aspx.cs:line 108
        at System.Threading.QueueUserWorkItemCallback.WaitCallback_Context(Object state)
        at System.Threading.ExecutionContext.runTryCode(Object userData)
        at System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData)
        at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
        at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean ignoreSyncCtx)
        at System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem()
        at System.Threading.ThreadPoolWorkQueue.Dispatch()
        at System.Threading._ThreadPoolWaitCallback.PerformWaitCallback()

    Thanks and cheers, Leon.
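
    One thing worth ruling out is the default limit of two concurrent HTTP connections per host: a threadpool crawler hammering a single host can exhaust them, leaving later GetResponse() calls queued until they time out. This is an assumption rather than a diagnosis from the stack trace alone, but it is cheap to test. A minimal sketch of the settings to try, applied once before the crawl starts (the class and the numbers are illustrative):

        using System;
        using System.Net;

        public static class CrawlerSettings
        {
            public static void Apply()
            {
                // Allow more than the default 2 simultaneous connections to the crawled host.
                ServicePointManager.DefaultConnectionLimit = 20;
            }

            public static HttpWebRequest CreateRequest(string url)
            {
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                request.UserAgent = "My Crawler";
                request.Timeout = 30000;    // fail fast instead of tying up a worker thread
                request.KeepAlive = false;  // hand the connection back promptly between pages
                return request;
            }
        }

    If the application is crawling itself while hosted on the Visual Studio development server, the limited request-handling capacity of that server is another plausible cause of timeouts that full IIS does not show; that too is only an assumption worth checking.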

    Read the article

  • Handling 'session expired' in JSF web application, running in JBoss AS 5

    - by Veera
    This question is related to my other question, "How to redirect to Login page when Session is expired in Java web application?". Below is what I'm trying to do: I have a JSF web application running on JBoss AS 5. When the user is inactive for, say, 15 minutes, I need to log the user out and redirect him to the login page if he tries to use the application after the session has expired. So, as suggested in 'JSF Logout and Redirect', I've implemented a filter which checks for the session-expired condition and redirects the user to a session-timed-out.jsp page if the session has expired. I've added SessionExpiryCheckFilter above all other filter definitions in web.xml, so that my session expiry check always gets the first hit. Now comes the challenge I'm facing: since I'm using JBoss AS, when the session expires, JBoss automatically redirects me to the login page (note that the session expiry check filter is not invoked). So, after I log in, my SessionExpiryCheckFilter intercepts the request and sees that a session is available. But it throws the exception javax.faces.application.ViewExpiredException: viewId:/mypage.faces - View /mypage.faces could not be restored. Has anyone faced this issue before? Any ideas to solve it?

    Read the article

  • SharePoint Visual web part and Oracle connection problem

    - by Rishi
    Hi, I'm trying to build a "visual web part" for SharePoint 2010 which should connect to an Oracle table and display the records on a SharePoint page. For development, Oracle 11g client (with ODP.NET), SharePoint Server 2010, Visual Studio 2010 and Oracle 10g Express are all running on my machine. First, I wrote sample code in an ASP.NET web app to connect to my local Oracle table and display the data in a grid view, and it works fine. My code is:

        OracleConnection con;
        try
        {
            // Connect
            string constr = "Data Source=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=localhost)(PORT=1521)))(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=XE)));User Id=SYSTEM; Password=password";
            con = new OracleConnection(constr);

            // Open database connection
            con.Open();

            // Execute a SQL SELECT
            OracleCommand cmd = new OracleCommand("select * from T_ACTIONPOINTS WHERE AP_STATUS='Active' ", con);
            OracleDataReader dr = cmd.ExecuteReader();
            GridView.DataSource = dr;
            GridView.DataBind();
            GridView.AllowPaging = true;
        }
        catch (Exception e)
        {
            lblError.Text = e.Message;
        }

    Now I'm trying to create a new SharePoint visual web part project using the same code and deploying it to my local SharePoint server. But when it runs I get an error (error screenshot and Solution Explorer screenshot omitted here). It looks like something is wrong with compatibility. Can someone point me in the right direction?
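
    Whatever the missing error screenshot shows, the sample above also leaks the connection and reader if anything throws. A small, hedged rewrite that disposes everything and keeps the credentials out of code (the connection-string name "OracleXE" and the helper class are assumptions):

        using System;
        using System.Configuration;
        using Oracle.DataAccess.Client;   // ODP.NET

        public static class ActionPointsData
        {
            public static void Bind(System.Web.UI.WebControls.GridView grid)
            {
                string constr = ConfigurationManager.ConnectionStrings["OracleXE"].ConnectionString;

                using (OracleConnection con = new OracleConnection(constr))
                using (OracleCommand cmd = new OracleCommand(
                    "select * from T_ACTIONPOINTS where AP_STATUS = 'Active'", con))
                {
                    con.Open();
                    using (OracleDataReader dr = cmd.ExecuteReader())
                    {
                        grid.DataSource = dr;
                        grid.DataBind();
                    }
                }
            }
        }

    For the web part itself, two other things worth checking (again as assumptions, since the error text is not shown) are whether the ODP.NET assembly matches the 64-bit SharePoint worker process and whether Oracle.DataAccess is deployed where the SharePoint server can load it; a mismatch there typically only surfaces once the code runs inside SharePoint rather than in a standalone ASP.NET app.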

    Read the article

  • Authenticated WCF: Getting the Current Security Context

    - by bradhe
    I have the following scenario: I have various users' data stored in my database. This data was entered via a web app. We'd like to expose this data back to the users over a web service so that they can integrate it with their applications. We would also like to expose some business logic over these services; as such we do not want to use OData. This is a multi-tenant application, so I only want to expose each user's data back to them and not to other users. Likewise, the business logic we expose should be relative to the authenticated user. I would like to let the user authenticate with the web service using an OASIS scheme (WCF already allows for this out of the box, as far as I understand), or perhaps we can issue them certificates to authenticate with. That bit hasn't really been worked out yet. Here is a bit of pseudo-code for how I envision this working within the service:

        function GetUsersData(id)
            var user := Lookup User based on Username from Auth Context
            var data := Get Data From Repository based on "user"
            return data
        end function

    For the business logic scenario I think it would look something like this:

        function PerformBusinessLogic(someData)
            var user := Lookup User based on Username from Auth Context
            var returnValue := Perform some logic based on supplied data
            return returnValue
        end function

    The hard bit here is getting the current username (or cert info, in the certificate scenario) that the user authenticated with. Does WCF even enable this scenario? If not, would WSE 3 enable this? Thanks,
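
    WCF does expose the caller's identity to service code through the security context, so the "Lookup User based on Username from Auth Context" step maps to a one-liner. A minimal sketch (the service class and return value are simplified stand-ins for the pseudo-code above, not a full contract):

        using System.ServiceModel;

        public class UsersService
        {
            public string GetUsersData(int id)
            {
                // Identity established by whatever credential type the binding negotiated
                // (Windows account, username token, or client certificate).
                ServiceSecurityContext ctx = ServiceSecurityContext.Current;
                string userName = ctx.PrimaryIdentity.Name;

                // A repository lookup keyed by the authenticated user would go here;
                // returning the name just shows where the value comes from.
                return userName;
            }
        }

    ServiceSecurityContext.Current (equivalently OperationContext.Current.ServiceSecurityContext) is populated for any authenticated WCF call, so the same lookup works for the business-logic operations; in the certificate scenario the primary identity carries the certificate subject information instead of a username.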

    Read the article

  • Avoiding GC thrashing with WSE 3.0 MTOM service

    - by Leon Breedt
    For historical reasons, I have some WSE 3.0 web services that I cannot upgrade to WCF on the server side yet (it is also a substantial amount of work to do so). These web services are being used for file transfers from client to server, using MTOM encoding. This can also not be changed in the short term, for reasons of compatibility. Secondly, they are being called from both Java and .NET, and therefore need to be cross-platform, hence MTOM.

    How it works is that an "upload" WebMethod is called by the client, sending up a chunk of data at a time, since files being transferred could potentially be gigabytes in size. However, due to not being able to control parts of the stack before the WebMethod is invoked, I cannot control the memory usage patterns of the web service.

    The problem I am running into is that for file sizes from 50MB or so onwards, performance is absolutely killed by GC, since it appears that WSE 3.0 buffers each chunk received from the client in a new byte[] array, and by the time we've done 50MB we're spending 20-30% of the time doing GC. I've played with various chunk sizes, from 16k to 2MB, with no real difference in results. Smaller chunks are killed by the latency involved with round-tripping, and larger chunks just postpone the slowdown until GC kicks in.

    Any bright ideas on cutting down on the garbage created by WSE? Can I plug into the pipeline somehow and jury-rig something that has access to the client's request stream and streams it to the WebMethod? I'm aware that it is possible to "stream" responses to the client using WSE (albeit very ugly), but this problem is with requests from the client.

    Read the article

  • Calling webservice via server causes java.net.MalformedURLException: no protocol

    - by Thomas
    I am writing a web service which parses an XML file. In the client, I read the whole content of the XML into a String and then pass it to the web service. If I run the web service with a main method as a plain Java application (for tests) there is no problem and no error messages. However, when I try to call it via the server, I get the following error: java.net.MalformedURLException: no protocol. I use the same XML file and the same code (without main), and I just cannot figure out what the cause of the error can be. Here is my code:

        DOMParser parser = new DOMParser();
        try {
            parser.setFeature("http://xml.org/sax/features/validation", true);
            parser.setFeature("http://apache.org/xml/features/validation/schema", true);
            parser.setFeature("http://apache.org/xml/features/validation/dynamic", true);
            parser.setErrorHandler(new myErrorHandler());
            parser.parse(new InputSource(new StringReader(xmlFile)));
            document = parser.getDocument();

    xmlFile is constructed in the client like this:

        String myFile = "C:/test.xml";
        File file = new File(myFile);
        String myString = "";
        FileInputStream fis = new FileInputStream(file);
        BufferedInputStream bis = new BufferedInputStream(fis);
        DataInputStream dis = new DataInputStream(bis);
        while (dis.available() != 0) {
            myString = myString + dis.readLine();
        }
        fis.close();
        bis.close();
        dis.close();

    Any suggestions will be appreciated!

    Read the article

  • What is your preferred tool stack for PHP development in the Windows Environment?

    - by Tim Visher
    I have been developing basic web sites for a while now, with some PHP thrown in for getting dynamic stuff done. However, I recently decided that it was time I got my hands a little dirtier, so I wanted to start playing with the underpinnings of WordPress and other such apps. I work on a Mac at home and have been using Coda for most of my editing needs, and I love it. Also, to manage the services stack I use MAMP at home. However, I've begun to realize that for heavy PHP and web work (AJAX, etc.), more is needed. I'm very interested to hear the kinds of tools more experienced web developers use on a day-to-day basis to manage their entire workflow. I found an article over on developer tutorials and decided to go with XAMPP (for the moment) for managing the services. However, IDE, source control, debugger, etc. are all up for grabs. Anyway, your thoughts are much appreciated. And, if at all possible, try to describe your entire stack in terms of what each tool fulfills in your development process.

    Read the article

  • Disable proxy for entire application?

    - by Brent
    Ever since upgrading to Visual Studio 2010, I'm running into an issue where the first web request of any type (WebRequest, WebClient, etc.) hangs for about 20 seconds before completing. Subsequent calls work quickly. I've narrowed the problem down to a proxy issue. If I manually disable proxy settings, I don't experience this delay:

        Dim wrq As WebRequest = WebRequest.Create(Url)
        wrq.Proxy = Nothing

    What's strange is that there are no proxy settings enabled on this machine in Internet Options. What I'm wondering is whether there is a way to disable proxy settings for my entire project in one shot, without explicitly disabling them as above for every web object. The main reason I want to do this is that I'm trying to use an API (http://code.google.com/p/google-api-for-dotnet/) which uses web requests but does not provide any way to manually disable proxy settings. I have found some information suggesting that I need to add some proxy information to the app.config file, but I get errors building my program if I make any edits to that file. Can anyone point me in the right direction?
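
    The per-request workaround in the snippet has a process-wide equivalent: clearing the default proxy once at startup, which every WebRequest/WebClient created afterwards (including those inside third-party libraries such as the Google API wrapper) will pick up unless they set their own Proxy. A minimal sketch, shown in C# although the snippet above is VB.NET; the class and method names are just placeholders:

        using System.Net;

        public static class ProxyConfig
        {
            // Call once at startup (e.g. from Main or Application_Start); afterwards
            // requests skip proxy auto-detection, the usual cause of this kind of first-call delay.
            public static void DisableDefaultProxy()
            {
                WebRequest.DefaultWebProxy = null;
            }
        }

    The declarative equivalent is a <system.net><defaultProxy enabled="false" /></system.net> section in app.config, which is presumably what the build-breaking config edits were attempting.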

    Read the article

  • Updating a DLL in a Production ASP.NET Web Site bin folder

    - by Josh Stodola
    I want to update a class library (a single DLL file) in a production web application. This web app is pre-compiled (published). I read an answer on StackOverflow (sorry, I can't seem to find it anymore because the Search function does not work very well) which led me to believe that I could just paste the new DLL into the bin folder and it would be picked up without problems (this would cause the worker process to recycle, which is fine with me because we do not use InProc session state). However, when I tried this, my site blows up and gives a FileLoadException saying that the assembly manifest definition does not match the assembly reference. What in the world is this?! Updating the DLL in Visual Studio and re-deploying the entire site works just fine, but it is a huge pain in the rear. What is the point of having a separate DLL if you have to re-deploy the entire site to implement any changes? Here's the question: how can I update a DLL on a production web site without breaking the app and without re-deploying all of the files?
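
    One likely explanation, though it is an assumption since the project settings are not shown, is that the class library uses an auto-incrementing version such as [assembly: AssemblyVersion("1.0.*")]. Every rebuild then produces a different version than the one the precompiled site was built against, which is exactly what a manifest-mismatch FileLoadException complains about, because an assembly's identity is its name, version, culture and public key. Pinning the version lets a drop-in bin replacement keep satisfying the old reference; a sketch of the relevant attributes:

        // AssemblyInfo.cs of the class library: keep AssemblyVersion fixed between drops
        // so the precompiled site's reference still matches the new DLL.
        using System.Reflection;

        [assembly: AssemblyVersion("1.0.0.0")]
        // AssemblyFileVersion is not part of the identity, so it can change freely per build.
        [assembly: AssemblyFileVersion("1.0.4.0")]

    With a stable version (and no strong-name key change), copying the rebuilt DLL into bin should be picked up on the next app-domain recycle without redeploying the rest of the site.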

    Read the article

  • broken SQL 2008 SP1 Web Edition (can not login with SSMS)

    - by gerryLowry
    Scenario: my installation of SQL Server 2008 Web Edition SP1 was working properly. Since I recently joined Microsoft's Website Spark*, I removed SQL 2008 and installed it again using my Website Spark edition and license from the MSDN download site. Next, I updated SQL 2008 to SP1 (this is required because I'm running Windows Server 2008 R2 Web edition). When I launch SSMS (SQL Server Management Studio), "User name" is "myhost\Administrator" and is greyed out so it cannot be changed. When I installed my Website Spark version, I did not include "myhost\Administrator" when configuring the SQL 2008 service accounts; instead I created an administrator account "myhost\mySQLaccount".

    ERROR MESSAGE:

        Connect to Server (X)
        Cannot connect to (local)
        Additional information:
        Login failed for user 'myhost\Administrator' (Microsoft SQL Server, Error: 18456)

    I tried to use SQL Server Configuration Manager to correct this problem but could not find any useful way to fix the issue. The connection dialog shows:

        Connect to Server ...
        Server type: Database Engine
        Server name: (local)
        Authentication: Windows Authentication

    How do I fix this problem? Please advise. Thank you. Gerry

    * http://www.microsoft.com/web/websitespark/default.aspx

    Read the article

  • Advice on a DB that can be uploaded to a website by a smart client for collecting survey feedback

    - by absfabs
    Hello, I'm hoping you can help. I'm looking for a zero-config multi-user database that my WinForms application can easily upload to a web server folder (together with 1 or 2 classic ASP pages), and am looking for suggestions/recommendations. The idea is that the database will be used to collect feedback entered by people filling in the ASP pages. The pages will write to the database using JavaScript. The database will subsequently be downloaded again for processing once the responses are in. In summary:

    - It will mostly run in MS Windows environments.
    - I have a modest budget for this and do not mind paying for such a database.
    - No runtime licensing costs.
    - Should be xcopy-deployable: once uploaded to a website folder it should be operational.
    - It should not have a .NET CLR dependency.
    - It should support a reasonable level of concurrent access. Average respondent count would be around 20-30, but one never knows.
    - Should be a reasonable size so that uploads/downloads to and from the site will be reasonably fast.

    Would appreciate your suggestions/comments. Many thanks, Abz

    To clarify: this is a desktop commercial application for feedback management in a vertical market. It uses SQL Server as the backing store. The application currently provides feedback management from email and paper feedback. I now want to add web feedback capability. Getting users to make their SQL Servers accessible to a website is not an option at this time, as I want to make getting up and running as painless as possible. I intend to release a web-based implementation of the software in the near future, but for now am looking at the above as a pragmatic way to provide web-based feedback collection.

    Read the article

  • Difference between KeywordQuery, FullTextQuerySearch type for Object Model and Web service Query

    - by Raghu
    Initially I believed these three to be doing more or less the same thing, with just the notation being different, until recently, when I noticed that there exists a big difference between the results of KeywordQuery/FullTextSqlQuery and the Web service Query. I used both the KeywordQuery and FullTextSqlQuery methods to search for the value of a custom column XYZ with value (ASDSADA-21312ASD-ASDASD).

    FullTextSqlQuery: when I run the query as

        FullTextSqlQuery myQuery = new FullTextSqlQuery(site);
        {
            // Construct query text
            String queryText = "Select title, path, author, isdocument from scope() where freetext('ASDSADA-21312ASD-ASDASD') ";
            myQuery.QueryText = queryText;
            myQuery.ResultTypes = ResultType.RelevantResults;
        };

        // execute the query and load the results into a datatable
        ResultTableCollection queryResults = myQuery.Execute();
        ResultTable resultTable = queryResults[ResultType.RelevantResults];

        // Load table with results
        DataTable queryDataTable = new DataTable();
        queryDataTable.Load(resultTable, LoadOption.OverwriteChanges);

    I get the following result representing the document:

        Title: TestPDF
        path: http://SharepointServer/Shared Documents/Forms/DispForm.aspx?ID=94
        author: null
        isDocument: false

    Do note the path and isDocument fields of the above result.

    Web service method: then I tried the Web service Query method. I used the SharePoint Search Service Tool available at http://sharepointsearchserv.codeplex.com/ and ran the same query, i.e. Select title, path, author, isdocument from scope() where freetext('ASDSADA-21312ASD-ASDASD'). This time I got the following results:

        Title: TestPDF
        path: http://SharepointServer/Shared Documents/TestPDF.pdf
        author: null
        isDocument: true

    Again, note the path. While the search results from the second method are useful, as they provide the exact file path, I can't seem to understand why the first method is not giving me the same results. Why is there a discrepancy between the two?

    Read the article

  • Splitting EJBs and interfaces into separate module -- deployment fails

    - by Hank
    I'm having trouble following this guide to "extract" my interfaces and entities from my EAR so I can use them from another web application. I use NetBeans 6.8 and GlassFish 3.0.1.

    - "Java Class Library" project: contains all the entities and interfaces
    - "Java EE Application" project: the class library is added to the project and packaged into the EAR; contains the EJB implementations, MDBs, tests
    - "Java Web Application" project: the class library is added to the project and packaged into the WAR; contains the REST interface

    When I build and deploy the web application, all goes well. When I build the Java EE application, I can see the jar file (interfaces, entities) being included. But when I try to deploy the EAR, GlassFish refuses it with a java.lang.NoClassDefFoundError:

        [#|2010-03-28T18:25:59.875+0200|WARNING|glassfishv3.0|javax.enterprise.system.tools.deployment.org.glassfish.deployment.common|_ThreadID=28;_ThreadName=Thread-1;|Error in annotation processing: java.lang.NoClassDefFoundError: mvs/core/StoreServiceLocal|#]

        [#|2010-03-28T18:25:59.876+0200|SEVERE|glassfishv3.0|javax.enterprise.system.core.com.sun.enterprise.v3.server|_ThreadID=28;_ThreadName=Thread-1;|Exception while deploying the app
        java.lang.IllegalArgumentException: Invalid ejb jar [CoreServer]: it contains zero ejb.
        Note:
        1. A valid ejb jar requires at least one session, entity (1.x/2.x style), or message-driven bean.
        2. EJB3+ entity beans (@Entity) are POJOs and please package them as library jar.
        3. If the jar file contains valid EJBs which are annotated with EJB component level annotations (@Stateless, @Stateful, @MessageDriven, @Singleton), please check server.log to see whether the annotations were processed properly.

    'mvs/core/StoreServiceLocal' is an interface defined in the library jar file. What am I doing wrong?

    Read the article

  • How do I build a JSON object to send to an AJAX WebService?

    - by Ben McCormack
    After trying to format my JSON data by hand in JavaScript and failing miserably, I realized there's probably a better way. Here's what the code for the web service method and the relevant classes looks like in C#:

        [WebMethod]
        public Response ValidateAddress(Request request)
        {
            return new test_AddressValidation().GenerateResponse(
                test_AddressValidation.ResponseType.Ambiguous);
        }
        ...
        public class Request
        {
            public Address Address;
        }

        public class Address
        {
            public string Address1;
            public string Address2;
            public string City;
            public string State;
            public string Zip;
            public AddressClassification AddressClassification;
        }

        public class AddressClassification
        {
            public int Code;
            public string Description;
        }

    The web service works great using SOAP/XML, but I can't seem to get a valid response using JavaScript and jQuery, because the message I get back from the server has a problem with my hand-coded JSON. I can't use the jQuery getJSON function because the request requires HTTP POST, so I'm using the lower-level ajax function instead:

        $.ajax({
            type: "POST",
            contentType: "application/json; charset=utf-8",
            url: "http://bmccorm-xp/HBUpsAddressValidation/AddressValidation.asmx/ValidateAddress",
            data: "{\"Address\":{\"Address1\":\"123 Main Street\",\"Address2\":null,\"City\":\"New York\",\"State\":\"NY\",\"Zip\":\"10000\",\"AddressClassification\":null}}",
            dataType: "json",
            success: function(response){
                alert(response);
            }
        })

    The ajax function is submitting everything specified in data:, which is where my problem is. How do I build a properly formatted JSON object in JavaScript so I can plug it into my ajax call like so: data: theRequest? I'll eventually be pulling the data out of text inputs in forms, but for now hard-coded test data is fine. How do I build a properly formatted JSON object to send to the web service?

    Read the article

  • Display new image on web page

    - by RuiT
    Hello, I'm uploading images to a folder and I want to display each new image on a web page every time one is saved in the folder. I can display the initial image, which is already in the folder, and I can also detect when a new image is saved into the folder, but I don't know how to display the new images. I'm new to web development. Can somebody help me? Here's the code:

        public partial class _Default : System.Web.UI.Page
        {
            string DirectoryPath = "C:\\Users\\Desktop\\PhotoUpload\\Uploads\\";

            protected void Page_Load(object sender, EventArgs e)
            {
                FileSystemWatcher watcher = new FileSystemWatcher();
                try
                {
                    watcher.Path = DirectoryPath;
                    watcher.Created += new FileSystemEventHandler(watcher_Created);
                    watcher.EnableRaisingEvents = true;

                    Image image = Image.FromFile(DirectoryPath + "inicial.jpg");
                    MemoryStream ms = new MemoryStream();
                    image.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
                    byte[] im = ms.ToArray();
                    Context.Response.ContentType = "Image/JPG";
                    Context.Response.BinaryWrite(im);
                }
                catch (Exception ex) { }
            }

            void watcher_Created(object sender, FileSystemEventArgs e)
            {
                Console.WriteLine("File Created: Name: " + e.Name);
                try
                {
                    // How to display the new image?
                }
                catch (Exception ex) { }
            }
        }
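
    A page only executes while a request is being processed, so the FileSystemWatcher above cannot push a new picture into a browser that has already received its response. A common pattern instead is a small handler that always serves the newest file in the folder, which the page re-requests from the client side (timer, meta refresh, or script). This is a sketch under that assumption; the handler name and the .jpg-only filter are illustrative:

        using System;
        using System.IO;
        using System.Linq;
        using System.Web;

        // Registered as e.g. LatestImage.ashx; point an <img> tag at it and re-request it
        // with a cache-busting query string to pick up newly uploaded files.
        public class LatestImageHandler : IHttpHandler
        {
            private const string DirectoryPath = @"C:\Users\Desktop\PhotoUpload\Uploads\";

            public void ProcessRequest(HttpContext context)
            {
                // Most recently written .jpg in the upload folder.
                string latest = new DirectoryInfo(DirectoryPath)
                    .GetFiles("*.jpg")
                    .OrderByDescending(f => f.LastWriteTimeUtc)
                    .Select(f => f.FullName)
                    .FirstOrDefault();

                if (latest == null)
                {
                    context.Response.StatusCode = 404;
                    return;
                }

                context.Response.ContentType = "image/jpeg";
                context.Response.Cache.SetCacheability(HttpCacheability.NoCache); // always fetch fresh
                context.Response.WriteFile(latest);
            }

            public bool IsReusable { get { return true; } }
        }

    With that in place the FileSystemWatcher is no longer needed for display purposes; the browser simply asks for the latest image each time it refreshes the tag.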

    Read the article

  • Web Grid, Client side Binding VS. Server side HTML generation

    - by Ron Harlev
    I'm working on replacing an existing web grid in an ASP.NET web application with a new implementation. The existing grid is powerful but not flexible enough. It also brings with it all kinds of frameworks we don't like having on our web pages. While looking into existing options I noticed I can break the available solutions into two main approaches. The older approach is represented best by the ASP.NET GridView: a classic ASP.NET control that generates the needed HTML on the server, based on a given set of data. The newer approach depends on client-side rendering, mainly with jQuery; a good example is jqGrid, where only the data is sent to the client (usually as JSON or XML). In the GridView case, if I want AJAX behavior, I would have to implement it with something like an update panel. Is there a definitive choice I should make? Is there a good chance of achieving the same snappy behavior I get with jqGrid (even with many records) with server-side rendered controls? Is there some hybrid implementation incorporating both approaches?

    Read the article

  • Open the Word Application from a button on a web page

    - by Andrea
    I'm developing a proof-of-concept web application: a web page with a button that opens the Word application installed on the user's PC. I'm stuck with a C# project in Visual Studio 2008 Express (Windows XP client, LAMP server). I followed the "Writing an ActiveX Control in .NET" tutorial, and after some tuning it worked fine. Then I added my button for opening Word. The problem is that I can reference Microsoft.Office.Interop.Word from the project, but I'm not able to access it from the web page. The error says "That assembly does not allow partially trusted callers". I've read a lot about security in .NET, but I'm totally lost now. Disclaimer: I've only been working with .NET for four days. I've tried to work around this issue but I cannot see the light! I don't even know if it will ever be possible!

        using System;
        using System.Collections.Generic;
        using System.ComponentModel;
        using System.Drawing;
        using System.Data;
        using System.Linq;
        using System.Text;
        using System.Windows.Forms;
        using Word = Microsoft.Office.Interop.Word;
        using System.IO;
        using System.Security.Permissions;
        using System.Security;

        [assembly: AllowPartiallyTrustedCallers]

        namespace OfficeAutomation
        {
            public partial class UserControl1 : UserControl
            {
                public UserControl1()
                {
                    InitializeComponent();
                }

                private void openWord_Click(object sender, EventArgs e)
                {
                    try
                    {
                        Word.Application Word_App = null;
                        Word_App = new Word.Application();
                        Word_App.Visible = true;
                    }
                    catch (Exception exc)
                    {
                        MessageBox.Show("Can't open Word application (" + exc.ToString() + ")");
                    }
                }
            }
        }

    Read the article

  • Advantages/Disadvantages of AIR vs Flex/Web

    - by Lizzan
    Hi all, I'm tasked with writing an application for placing and connecting objects (sort of like a room planner where you can place furniture). I've made a demo using Flash Builder 4 and built it for AIR as a desktop app. Now the client wants the full app, but they and I are unsure whether to continue building it as an AIR app or to turn it into a web application using Flex. I tried a simple conversion of the AIR app to a web app, and most things worked, but not all; the things that don't work seem to be simple bugs, though, not a complete lack of capability. The capabilities I'm going to need (apart from the modelling) are:

    - Printing of the finished image plus a list of the furniture that has been placed
    - A way to save and retrieve finished plans
    - A way to export the list of furniture to Excel format
    - Handling a whole slew of data about the different objects

    Only the printing has been implemented so far, and it seems to work in the web app as well. What advantages/disadvantages are there to the two approaches? Are any of the capabilities I need much worse (or even impossible) to implement in either approach?

    Read the article

  • Long-held TCP sessions in an ASMX client

    - by John
    Hi, I have an ASP.NET application which talks to a third-party SOAP web service. My application uses an ASMX client proxy (i.e. System.Web.Services.Protocols.SoapHttpClientProtocol). The third-party service uses WCF, although I don't expect that makes much difference. I should note that we're using .NET 3.5 SP1. We haven't customised the proxy or done anything unusual - we're just making standard web service requests and getting back the results. We have encapsulated the proxy reference within a using block so it will get disposed after the response is received. We've been told that our application is behaving strangely in its use of TCP sessions. Instead of opening a new TCP session for each request from a new proxy instance (which is what I would have expected it to do), it's apparently keeping several connections alive and re-using them. This is causing some issues at the third party end, as they are expecting us to be using multiple sessions. Is this a known behaviour for the SoapHttpClientProtocol client proxy? If so, is there any way we can override it so that each request results in a new TCP session? Thanks, John
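
    By default the HTTP layer underneath SoapHttpClientProtocol pools connections per ServicePoint and keeps them alive across proxy instances, so disposing the proxy does not close the underlying TCP session. If the third party really needs one TCP session per request, a minimal sketch is to turn off keep-alive in an override of GetWebRequest; the class name below is a placeholder for the generated proxy type, which is a partial class, so the override can live in a separate file:

        using System;
        using System.Net;
        using System.Web.Services.Protocols;

        public partial class ThirdPartyServiceProxy : SoapHttpClientProtocol
        {
            protected override WebRequest GetWebRequest(Uri uri)
            {
                HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(uri);
                request.KeepAlive = false;  // sends "Connection: close", so each call tears down its TCP session
                return request;
            }
        }

    Connection reuse is normally desirable, so this trades a little throughput and latency for the session-per-request behaviour the third party expects.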

    Read the article

  • Refreshing WEB-INF/lib in Google App Engine (with Eclipse)

    - by Adrian Petrescu
    Hi, I've created a new Google App Engine project within Eclipse. I copied several JARs that I need for my application into the WEB-INF/lib directory and added them to the build path. I make some calls to these JARs from within the handler, deploy, and everything works fine. However, if I then change one of the JARs outside the project and copy the new version to WEB-INF/lib (with the same name) and re-deploy, it doesn't seem to send the new JAR; everything is still linking to the old one, even though it's not even in my WEB-INF/lib anymore. I'm guessing it's being cached by the server, or Eclipse does not realize something has changed and so skips uploading the new version. If I just create a new project with the new JAR, everything is fine again (until I have to make another change...), but of course I don't want to create a new project for every change to a dependency. My question is: how can I make GAE re-upload all the JARs I have from within Eclipse? Thanks in advance, guys :) -Adrian

    Read the article

  • Add webservice reference, the classes in different file

    - by ArunDhaJ
    Hi, when I add a web service as a service reference in my .NET project, it creates a service folder within "Service References". All the interfaces and classes are contained within that service folder. I want to know how to split the interface's methods and classes apart; specifically, I want the web service reference to import classes defined in a different DLL. I want to structure it this way because of my design constraints: I have a 3-layered application, of which the third layer is the communication layer that holds all web service references, the second is the business layer, and the first is the presentation layer. If the class declarations live in layer 3 and I access those classes from the presentation layer, it is logically a cross-layer access violation. Instead, I want a separate project that holds only the class declarations and is used by all 3 layers. I had no problems achieving this with layers 1 and 2, but I'm not sure how to make the communication layer use this common declarations DLL. Your suggestions would help me design my application better. Thanks in advance. Regards, ArunDhaJ

    Read the article

  • Obfuscating ASP.Net dll breaks web application.

    - by uriDium
    I wouldn't usually bother to obfuscate a web application DLL, but right now I have to share server space with someone who might have a conflict of interest and might be tempted to decompile it and steal the deal. Not an ideal situation, I know, but hey. So I am using VS 2005, a web deployment project (which compiles into a single DLL) and Dotfuscator Community Edition. When I obfuscate the DLL, the web application breaks and I get a message like "Could not load type 'Browse' from assembly MyAssembly". I searched around and found that disabling renaming should fix it, which it does. But now when I look at the DLL using .NET Reflector I can see everything again, so this seems kind of pointless. Is there a way to get this to work? Is there a better way to protect my DLL from someone I have to share a server with? UPDATE: I figured out my problem. All the class names have changed, so now all my page directives, e.g.

        <%@ Page Language="C#" AutoEventWireup="true" CodeFile="mycode.aspx.cs" Inherits="mycode" %>

    are incorrect, because the class mycode no longer exists; it is now aef or something. Is there any tool out there that will also change the names in the CodeFile and Inherits attributes?

    Read the article
