Search Results

Search found 5851 results on 235 pages for 'binary compatibility'.

Page 60/235 | < Previous Page | 56 57 58 59 60 61 62 63 64 65 66 67  | Next Page >

  • Problem with uninstalling Microsoft .NET Framework 4 Extended Beta 2 on Windows Vista

    - by empi
    Hi. I have a problem uninstalling Microsoft .NET Framework 4 Extended Beta 2. I started the uninstall but cancelled it partway through. Windows then asked whether, since the uninstallation had a problem, I wanted to run it in compatibility mode, and I accidentally agreed. Since then, every time I try to uninstall it, I get an error saying the installer cannot run in compatibility mode. How can I fix this? I looked at the installer file and it is not marked to run in compatibility mode, and I cannot find whichever file was flagged for compatibility mode when I answered that prompt. Thanks in advance for any help.
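
    For anyone hitting the same wall: the "run this program in compatibility mode" answers from such prompts are recorded per executable in the registry, typically under the AppCompatFlags\Layers key (per-user in HKCU, machine-wide in HKLM; the Program Compatibility Assistant keeps related entries under neighboring subkeys as well), so the flagged file can often be found there even when its Properties dialog shows nothing. A small Java sketch that shells out to the stock reg.exe tool to dump the per-user entries; treat this as a starting point rather than a guaranteed fix:

      import java.io.BufferedReader;
      import java.io.InputStreamReader;

      public class CompatFlagsDump {
          public static void main(String[] args) throws Exception {
              // Compatibility-mode flags are stored per user under this
              // registry key (also check the HKLM twin for all-users flags).
              String key = "HKCU\\Software\\Microsoft\\Windows NT\\CurrentVersion\\AppCompatFlags\\Layers";
              Process p = new ProcessBuilder("reg", "query", key)
                      .redirectErrorStream(true).start();
              try (BufferedReader r = new BufferedReader(
                      new InputStreamReader(p.getInputStream()))) {
                  String line;
                  while ((line = r.readLine()) != null) {
                      // Each value names an executable and its layer flags.
                      System.out.println(line);
                  }
              }
          }
      }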

    Read the article

  • SQL Server 2005 to 2008 upgrade - are MDF files binary compatible?

    - by james
    I have 50 databases on a MS SQL Server 2005 system and want to upgrade to MS SQL Server 2008. This is what I tried on some test machines: 1. copied the \DATA directory from the source (MSSQL 2005) server to exactly the same path on the target (MSSQL 2008) server; 2. edited the startup parameters on the MSSQL 2008 service to point to the path of the MSSQL 2005 master database; 3. restarted the MSSQL service. It worked and I can access all databases, tables and data. My questions are: I go back to SQL Server 4.2 and it has never been this easy. I know it worked, but should it have worked? Am I missing something, or is there going to be a gotcha next week? These are simple databases with just tables, views and indexes. No cross-database links, no triggers etc.
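
    One point worth adding: attaching 2005-era data files to a 2008 instance upgrades their internal format in place, and that conversion is one-way, so keep the 2005 copies until you are sure. Each database also keeps compatibility level 90 (SQL Server 2005 behavior) until you raise it yourself. A small JDBC sketch to check the levels, assuming the Microsoft JDBC driver and a placeholder connection string:

      import java.sql.*;

      public class CompatLevelCheck {
          public static void main(String[] args) throws SQLException {
              // Placeholder connection string: adjust host and credentials.
              String url = "jdbc:sqlserver://localhost;integratedSecurity=true";
              try (Connection con = DriverManager.getConnection(url);
                   Statement st = con.createStatement();
                   ResultSet rs = st.executeQuery(
                           "SELECT name, compatibility_level FROM sys.databases")) {
                  while (rs.next()) {
                      // 90 = SQL Server 2005 behavior, 100 = SQL Server 2008
                      System.out.printf("%s -> %d%n", rs.getString(1), rs.getInt(2));
                  }
              }
          }
      }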

    Read the article

  • SQL SERVER – Simple Demo of New Cardinality Estimation Features of SQL Server 2014

    - by Pinal Dave
    SQL Server 2014 has a new cardinality estimation algorithm. The cardinality estimation logic is responsible for the quality of query plans, and is thus largely responsible for the performance of any query. This logic had not been updated for quite a while, but in the latest version, SQL Server 2014, it has been re-designed. The new logic incorporates various assumptions and algorithms for OLTP and warehousing workloads. "Cardinality estimates are a prediction of the number of rows in the query result. The query optimizer uses these estimates to choose a plan for executing the query. The quality of the query plan has a direct impact on improving query performance." ~ Source: MSDN. Let us see a quick example of how cardinality estimation improves performance for a query. I will be using the AdventureWorks database for my example. Before we start with this demonstration, remember that even though you have SQL Server 2014, to see the effect of the new cardinality estimates you will need your database compatibility level set to 120, which is the level for SQL Server 2014. If your server instance is SQL Server 2014 but you have set your database compatibility level to 110 or any other earlier version, your query will perform as it would on the older version of SQL Server. Now we will execute the following query in two different compatibility modes and compare its performance. (Note that my SQL Server instance is of version 2014.)

      USE AdventureWorks2014
      GO
      -- -------------------------------
      -- NEW Cardinality Estimation
      ALTER DATABASE AdventureWorks2014 SET COMPATIBILITY_LEVEL = 120
      GO
      EXEC [dbo].[uspGetManagerEmployees] 44
      GO
      -- -------------------------------
      -- Old Cardinality Estimation
      ALTER DATABASE AdventureWorks2014 SET COMPATIBILITY_LEVEL = 110
      GO
      EXEC [dbo].[uspGetManagerEmployees] 44
      GO

    Result of Statistics IO

    Compatibility level 120:
      Table ‘Person’. Scan count 0, logical reads 6, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
      Table ‘Employee’. Scan count 2, logical reads 7, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
      Table ‘Worktable’. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
      Table ‘Worktable’. Scan count 2, logical reads 7, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    Compatibility level 110:
      Table ‘Worktable’. Scan count 2, logical reads 7, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
      Table ‘Person’. Scan count 0, logical reads 137, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
      Table ‘Employee’. Scan count 2, logical reads 7, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
      Table ‘Worktable’. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    You will notice that at compatibility level 110 there are 137 logical reads from the Person table, whereas at compatibility level 120 there are only 6 logical reads from the Person table. This drastically improves the performance of the query. If we enable the execution plan, we can see the same improvement there as well. I hope you will find this quick example helpful. You can read more about this in my latest Pluralsight course.
    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Firefox 3.6.3 on Snow Leopard 10.6.3 - symbolic link to command line binary doesn't work?

    - by David Watson
    I have Firefox 3.6.3 installed on Mac OS X Snow Leopard 10.6.3 from the DMG. I can run Firefox from the terminal using /Applications/Firefox.app/Contents/MacOS/firefox-bin. However, if I create a symbolic link:

      sudo ln -s /Applications/Firefox.app/Contents/MacOS/firefox-bin /bin/firefox

    then it refuses to run, or at least display. When I issue "firefox" from the terminal, I can see the process in top, but the GUI never appears. :/

      $ ls -l /bin/firefox
      lrwxr-xr-x 1 root wheel 52 May 5 15:19 /bin/firefox -> /Applications/Firefox.app/Contents/MacOS/firefox-bin

    Any ideas? Thanks, David

    Read the article

  • Cross-Browser Extension Installation now Possible with Opera and Google Chrome

    - by Akemi Iwaya
    People have been curious if there would be cross-browser compatibility for extensions due to Opera’s recent switch to the browser engine that Google Chrome uses. That question has now been answered. The OMG! Chrome! Blog has put together a nice tutorial on how to get cross-browser extension compatibility set up and working with your browser of choice. Screenshot courtesy of OMG! Chrome! Blog. While it is not surprising that the first steps in cross-browser extension compatibility have been taken, it will be interesting to see how it develops as the process is refined and further development occurs with the ‘new’ Opera. What are your thoughts on this? Is cross-browser extension compatibility really that important? Perhaps you feel that it does not matter? Let us know your thoughts in the comments!    

    Read the article

  • Oracle Tutor: *** CAUTION to Word .docx Users ***

    - by [email protected]
    Microsoft released a security update (KB969604) for Office 2007 around June 2009. This update causes document variables within Word .docx files to be scrambled, and it might still be pushed out via Office 2007 updates. DO NOT save files as .docx using MS Office 2007 until you apply the MS hotfix #970942, available here.
    If you are using Windows XP with Office 2003 or Office 2000 and have installed an older Office 2007 compatibility pack, documents saved as .docx may also end up with scrambled document variables. Installing the 2007 compatibility pack published on 1/6/2010 (version 4) will prevent the document variables from becoming corrupt. Those on Windows 2000 may not be able to install the latest compatibility pack, or the compatibility pack may not function properly. This situation will hopefully be rectified in the coming months.
    What is a document variable? Document variables store data inside the document, invisible to the user. The Tutor software uses them when converting the document to HTML and when creating the flowchart, to name a couple of uses.
    How will you know if a document's variables are scrambled? The difficulty in diagnosing the issue is that the symptoms can take myriad forms. There isn't a single error message or a single feature that one can point to and say, "test for the problem by doing this." The best clue is an error message containing garbage characters, question marks, XML code snippets, or plain nonsense, such as "Language ?????????????xlr;lwlerkjl could not be found." It is also possible to see the corrupted data in the footers of the Word documents. And just because the footers look correct does not mean that the document variables are not corrupted: the corruption does not hit every document variable in the document, just some of them, often fewer than a quarter.
    What is the difference between docx files and doc files? Office 2007 uses Office Open XML formats with .docx and .docm filename extensions. Docx is an Office Open XML word document; docm is a macro-enabled Office Open XML document. This means the file structure behind the scenes is quite different from the binary file formats used prior to Office 2007, such as .doc, .dot, .xls, and .ppt.
    Solution summary: For Windows XP and Word 2007, install the hotfix, or save files as *.doc. For Windows XP and Word 2000 or 2003, install the latest compatibility pack, or save files as *.doc. For Windows 2000 with Word 2000 or 2003, do not use any compatibility pack; save files as *.doc.
    Emily Chorba, Principal Product Manager for Oracle Tutor

    Read the article

  • Where should my application setup put the binary executables in Windows 7?

    - by KeyboardMonkey
    I created a small Windows app and am building a setup for it using NSIS, but I can't find out where to put the executables to conform to the new Windows security model. Traditionally we put program files in, well, "C:\Program Files". With the security model getting more mangled with each Windows version, some users have restricted accounts, and I'm not sure installing into Program Files will work for those users. Where can I install my program's files to cater for these lower-privileged users? Oh, and I want to avoid ClickOnce.
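
    One route worth considering is a per-user install: files placed under %LOCALAPPDATA% (for example %LOCALAPPDATA%\Programs\<YourApp>, a convention several per-user installers follow) are writable without elevation, so restricted accounts can install and update there. A tiny Java illustration of resolving that target; the "Programs\MyApp" suffix is a placeholder, not a requirement:

      import java.nio.file.Path;
      import java.nio.file.Paths;

      public class InstallDir {
          public static void main(String[] args) {
              // %LOCALAPPDATA% is per-user and writable without elevation,
              // unlike %ProgramFiles%.
              String localAppData = System.getenv("LOCALAPPDATA"); // Windows only
              Path installDir = Paths.get(localAppData, "Programs", "MyApp");
              System.out.println("Per-user install target: " + installDir);
          }
      }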

    Read the article

  • How do operating systems… run… without having an OS to run in?

    - by Plazmotech Binary
    I'm really curious right now. I'm a Python programmer, and this question just boggled me: You write an OS. How do you run it? It has to be run somehow, and that way is within another OS? How can an application run without being in an OS? How do you tell the computer to run, say, C, and execute these commands to the screen, if it doesn't have an OS to run in? Does it have to do with a UNIX kernel? If so, what is a Unix kernel, or a kernel in general? I'm sure OSes are more complicated than that, but how does it work?

    Read the article

  • Any pre-rolled System.IO abstraction libraries out there for Unit Testing?

    - by Binary Worrier
    To test methods that use the file system, we basically need to put System.IO behind a set of interfaces that we can then mock; I do this with a DiskIO class and interface. As my DiskIO code gets larger (and the grumblings from the "we're unconvinced about this TDD thing" crowd here at work get louder), I went looking for a comprehensive open source library that already does this and found . . . nothing. I may be looking in the wrong place, or I may have approached this problem in completely the wrong way. I can't be the only idiot in this position: do these libraries exist, and if so, where are they? Any you've used and would recommend? Thanks. P.S. I'm happy with my current approach, i.e. starting with what we need and adding only when the need arises. Unfortunately the "we're unconvinced about this TDD thing" crowd remain unconvinced, and think that I can't be right.
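
    For comparison, the Java world meets this exact need with the java.nio.file.FileSystem abstraction plus an in-memory implementation such as Google's Jimfs, which is roughly the shape such a .NET library would take. A minimal sketch, assuming the com.google.jimfs:jimfs dependency:

      import com.google.common.jimfs.Configuration;
      import com.google.common.jimfs.Jimfs;
      import java.nio.charset.StandardCharsets;
      import java.nio.file.FileSystem;
      import java.nio.file.Files;
      import java.nio.file.Path;

      public class DiskIoTest {
          public static void main(String[] args) throws Exception {
              // Code under test accepts a FileSystem, so a test can hand it
              // an in-memory one instead of the real disk.
              try (FileSystem fs = Jimfs.newFileSystem(Configuration.unix())) {
                  Path file = fs.getPath("/work/report.txt");
                  Files.createDirectories(file.getParent());
                  Files.write(file, "hello".getBytes(StandardCharsets.UTF_8));
                  System.out.println(Files.readAllLines(file, StandardCharsets.UTF_8));
              }
          }
      }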

    Read the article

  • Type of AI to tackle this problem?

    - by user1154277
    I posted this on Stack Overflow, but a user there recommended I post it here as well, so I'd like your recommendations too. I'm going to say from the beginning that I am not a programmer; I have a cursory knowledge of different types of AI and am just a businessman building a web app. Anyway, the web app I am investing in is for a hobby of mine. There are many part manufacturers, product manufacturers, upgrade and add-on manufacturers, etc. for hardware/products in this hobby's industry. Currently I am building a crowdsourced platform where knowledgeable people can go in and mark up compatibility between those parts, as it's not always clear-cut. For example: Manufacturer A makes an "A" class product, and manufacturer B makes an upgrade/part that generally goes with class "A" products but is, for one reason or another, not compatible with Manufacturer A's particular "A" class product. However, a good chunk (60%-70%) of the products/parts in the database can have their compatibility inferred from their properties. For example: part 1 is type "A" with an "X" mm receiver and part 2 is also type "A" with an "X" mm interface, and thus the two parts are compatible; or part 1 is an 8mm gear, thus all 8mm bushings from any manufacturer are compatible with part 1. Furthermore, gears can only have compatibility relationships in the database with bushings and gearboxes; there can be no meaningful compatibility between a gear and a rail or receiver, since those parts don't interface. Now what I want is an AI that can learn from the decisions of the crowdsourced platform community and infer compatibility for new parts/products based on their tagged attributes, what type of part they are, etc. What would be the best form of AI to tackle this? I was thinking an expert system, but explicitly engineering all of the knowledge rules would be daunting because of the complex relations between literally tens of thousands of parts, hundreds of part types and many manufacturers. Would an ANN (neural network) be ideal to learn from the many inputs/decisions of the crowdsourced platform users? Any help/input is much appreciated.
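
    For the 60%-70% of cases described as attribute-driven, plain declarative rules over the tagged attributes may cover a lot of ground before any learning component is needed. A toy Java sketch of that idea, where the part types and attribute names are invented purely for illustration:

      import java.util.Map;

      public class CompatRules {
          // A hypothetical part: a type plus attributes tagged by the community.
          record Part(String type, Map<String, String> attrs) {}

          // One declarative rule: a gear and a bushing are compatible
          // when their bore sizes match.
          static boolean compatible(Part a, Part b) {
              if (a.type().equals("gear") && b.type().equals("bushing")) {
                  return a.attrs().get("bore_mm").equals(b.attrs().get("bore_mm"));
              }
              // Parts whose interfaces never meet (e.g. gear vs. rail)
              // simply have no compatibility relationship.
              return false;
          }

          public static void main(String[] args) {
              Part gear = new Part("gear", Map.of("bore_mm", "8"));
              Part bushing = new Part("bushing", Map.of("bore_mm", "8"));
              System.out.println(compatible(gear, bushing)); // true
          }
      }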

    Read the article

  • backup util for binary/media files. (to use with source control)

    - by acidzombie24
    I am using git for my source control. I don't back up media such as GIFs, PNGs, etc. I am thinking that every time I tag a release it would be a good idea to back up the media files as well, but I don't want to make several copies of the same file each time I create a tag. I'd like an app that checks whether a file already exists and that can restore everything to a version I choose. What utility might I use to do this? I'm using Windows 7.
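
    The "don't store the same file twice" requirement is what a content-addressable store gives you: name each stored file after a hash of its bytes, so identical media is kept once no matter how many tags reference it (git stores its own blobs the same way). A minimal Java sketch of the idea, with placeholder paths; a real tool would also keep a per-tag manifest mapping file names to hashes for restores:

      import java.nio.file.*;
      import java.security.MessageDigest;
      import java.util.HexFormat;

      public class MediaStore {
          public static void main(String[] args) throws Exception {
              Path source = Paths.get("media/logo.png");   // file to back up (placeholder)
              Path store  = Paths.get("backup/objects");   // dedup store (placeholder)
              byte[] data = Files.readAllBytes(source);
              String hash = HexFormat.of().formatHex(
                      MessageDigest.getInstance("SHA-256").digest(data));
              Files.createDirectories(store);
              Path target = store.resolve(hash);
              if (Files.notExists(target)) {               // identical content stored once
                  Files.write(target, data);
              }
              System.out.println(source + " -> " + hash);
          }
      }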

    Read the article

  • What is the best Binary Decision Diagram library for Java?

    - by reprogrammer
    A Binary Decision Diagram (BDD) is a data structure for representing boolean functions. I'd like to use this data structure in a Java program. My search for Java-based BDD libraries turned up the following packages: JavaBDD, JDD, JBDD, and bddbddb. If you know of any other BDD libraries available for Java programs, please let me know so that I can add them to the list above. If you have used any of these libraries, please tell me about your experience with them. In particular, I'd like you to compare the available libraries along the following dimensions. Quality: is the library mature and reasonably bug-free? Performance: how do you evaluate the performance of the library? Support: could you easily get support whenever you encountered a problem with the library? Was the library well documented? Ease of use: was the API well designed? Could you install and use the library quickly and easily? Please mention the version of the library that you are evaluating.
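
    For readers meeting the structure for the first time: a BDD is a DAG of if-then-else nodes over ordered variables, with identical subgraphs shared. A toy hand-rolled Java sketch of just the node sharing and evaluation (none of the reduction and apply machinery a real library provides):

      import java.util.HashMap;
      import java.util.Map;

      public class TinyBdd {
          // Internal nodes test one variable; terminals are Boolean.TRUE/FALSE.
          record Node(int varIndex, Object lo, Object hi) {}
          static final Map<Node, Node> unique = new HashMap<>(); // hash-consing table

          static Object mk(int varIndex, Object lo, Object hi) {
              if (lo.equals(hi)) return lo; // redundant test: collapse the node
              return unique.computeIfAbsent(new Node(varIndex, lo, hi), n -> n);
          }

          // Follow lo/hi edges according to the variable assignment.
          static boolean eval(Object node, boolean[] assignment) {
              while (node instanceof Node n) {
                  node = assignment[n.varIndex()] ? n.hi() : n.lo();
              }
              return (Boolean) node;
          }

          public static void main(String[] args) {
              // f(x0, x1) = x0 AND x1, built bottom-up
              Object x1 = mk(1, Boolean.FALSE, Boolean.TRUE);
              Object f  = mk(0, Boolean.FALSE, x1);
              System.out.println(eval(f, new boolean[]{true, true}));  // true
              System.out.println(eval(f, new boolean[]{true, false})); // false
          }
      }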

    Read the article

  • Getting message in Android app: Binary XML file line #2: You must supply a layout_width attribute.

    - by opike
    I'm trying to use a ListView inside of a RelativeLayout, but when I run my app I get a RuntimeException with the message: Binary XML file line #2: You must supply a layout_width attribute. I tried putting layout_width attributes in every conceivable place in the XML resource files, but so far no luck. I attempt to populate the ListView with this line of code:

      setListAdapter(new ArrayAdapter(this, R.layout.tablerow3, R.id.label, items));

    Here's the tablerow3.xml contents:

      <?xml version="1.0" encoding="utf-8"?>
      <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android">
          android:layout_width="20dp"
          android:layout_height="5dp"
          android:id="@+id/tablerow01">
          <Label android:id="@+id/label01"
              android:layout_width="5dp"
              android:layout_height="5dp"
              android:textColor="@color/solid_white"
              android:singleLine="true"/>
          <Label android:id="@+id/label02"
              android:layout_width="5dp"
              android:layout_height="5dp"
              android:textColor="@color/solid_white"
              android:singleLine="true"/>
      </LinearLayout>

    Here's the XML that contains the RelativeLayout (forex2.xml):

      <?xml version="1.0" encoding="utf-8"?>
      <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
          android:layout_width="fill_parent"
          android:layout_height="fill_parent">
          <Button android:text="Static Button"
              android:layout_width="wrap_content"
              android:layout_height="wrap_content"
              android:layout_alignParentLeft="true"
              android:layout_alignParentTop="true"
              android:id="@+id/button_id">
          </Button>
          <Spinner android:id="@+id/spinner1"
              android:layout_width="match_parent"
              android:layout_toRightOf="@id/button_id"
              android:layout_alignParentTop="true"
              android:layout_height="wrap_content"
              android:drawSelectorOnTop="true" />
          <ListView android:id="@android:id/list"
              android:layout_width="5dp"
              android:layout_height="5dp"
              android:layout_alignParentBottom="true" />
          <!-- android:layout_width="wrap_content" android:layout_height="wrap_content" -->
      </RelativeLayout>

    Read the article

  • Java InputReader. Detect if file being read is binary?

    - by Trizicus
    I had posted a question in regard to this code. I found that JTextArea does not support the binary data that gets loaded. So my new question is: how can I detect the 'bad' file, cancel the file I/O, and tell the user that they need to select a new file?

      class Open extends SwingWorker<Void, String> {
          File file;
          JTextArea jta;

          Open(File file, JTextArea jta) {
              this.file = file;
              this.jta = jta;
          }

          @Override
          protected Void doInBackground() throws Exception {
              BufferedReader br = null;
              try {
                  br = new BufferedReader(new FileReader(file));
                  String line = br.readLine();
                  while(line != null) {
                      publish(line);
                      line = br.readLine();
                  }
              } finally {
                  try {
                      br.close();
                  } catch (IOException e) {
                  }
              }
              return null;
          }

          @Override
          protected void process(List<String> chunks) {
              for(String s : chunks)
                  jta.append(s + "\n");
          }
      }
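
    A common heuristic is to sniff the first few kilobytes before opening a Reader: text files essentially never contain NUL bytes, so finding one is a cheap "this is binary" signal. A sketch of that check (the 8 KB window and the NUL test are conventional choices, not a standard API); doInBackground() could call it first and, if it returns true, publish an error message instead of reading the file:

      import java.io.File;
      import java.io.FileInputStream;
      import java.io.IOException;

      public class BinarySniffer {
          // Heuristic: treat the file as binary if its first 8 KB
          // contains a NUL byte, which text files essentially never do.
          static boolean looksBinary(File file) throws IOException {
              byte[] buf = new byte[8192];
              try (FileInputStream in = new FileInputStream(file)) {
                  int n = in.read(buf);
                  for (int i = 0; i < n; i++) {
                      if (buf[i] == 0) return true;
                  }
              }
              return false;
          }
      }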

    Read the article

  • For business people to manage, keep binary images in MySQL or just the urls?

    - by Michael Mao
    Hello everyone: I am working on a task to enable image uploading and auto-scaling (from full size to thumbnail) with jQuery and PHP. I can naturally come up with two approaches: first, store the images as binary objects directly in MySQL; second, store only URLs to the images and keep the image files somewhere on the server. The images are for everyone to view, so there are no security restrictions as far as I know. Personally I don't have a preference; however, at the end of the day it is the business people who are going to manage the images as part of the system (CRUD). So I am wondering which approach is a bit better for them? Of course I am building an easy-to-use, visual web interface for the staff to control the process, but I am not sure that is enough. Experience tells me that if I don't think ahead and seek the most flexible approach, I will probably screw myself sooner or later. PS. The following link is what I've found so far, which is pretty cool, no Flash involved :) Andrew Valum's ajax image upload jQuery plugin

    Read the article

  • SSIS String or binary data would be truncated. The statement has been terminated.

    - by Subbarao
    When I run the SSIS package from BIDS it runs fine without any error or problem. When I try to call it through an ASP.NET website I get the following error: "String or binary data would be truncated. The statement has been terminated." I checked all the columns and data to see if anything is exceeding the limit; everything is fine. I can run the package from the command line using dtexec (C:\>dtexec /f "C:\temp\MyTempPackage.dtsx") and it executes without any problem. The problem is only when I try to run it through ASP.NET. The following is the code that I am trying to use:

      //DTS Runtime Application
      Application app = new Application();
      //DTS Package
      Package package = app.LoadPackage(packagePath, null);
      //Execute and get the result
      DTSExecResult result = package.Execute();

    I am making a call to a web service from ASP.NET which has the above code. Both the web service and the website have identity impersonation enabled. I have identity enabled in my web.config for this:

      <identity impersonate="true" userName="MyUserName" password="MyPassword"/>

    This problem occurs only when I am trying to import an Excel file (.xlsx); when I import a .txt file everything is fine. The Excel import blew up in both 32-bit and 64-bit environments. Help on how to make this work is greatly appreciated.

    Read the article

  • How to find sum of node's value for given depth in binary tree?

    - by masato-san
    I've been scratching my head for several hours over this problem:

      Binary Tree
                (0)            depth 0
               /   \
             10     20         depth 1
            /  \   /  \
          30   40 50   60      depth 2

    I am trying to write a function that takes a depth as its argument and returns the sum of the values of the nodes at that depth. For instance, if I pass 2, it should return 180 (i.e. 30+40+50+60). I decided to use breadth-first search and, when I find a node at the desired depth, add its value to the sum, but I just can't figure out how to determine which node is at what depth. With this approach I feel like I'm heading in totally the wrong direction.

      function level_order($root, $targetDepth) {
          $q = new Queue();
          $q->enqueue($root);
          while(!$q->isEmpty) {
              //how to determine the depth of the node???
              $node = $q->dequeue();
              if($currentDepth == $targetDepth) {
                  $sum = $node->value;
              }
              if($node->left != null) {
                  $q->enqueue($node->left);
              }
              if($node->right != null) {
                  $q->enqueue($node->right);
              }
              //need to reset this somehow
              $currentDepth++;
          }
      }
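
    The standard trick is to process the queue one whole level at a time: snapshot the queue size at the start of each level, drain exactly that many nodes while enqueueing their children, and increment the depth counter once per level rather than once per node. A Java sketch of that idea:

      import java.util.ArrayDeque;
      import java.util.Queue;

      public class DepthSum {
          record Node(int value, Node left, Node right) {}

          static int sumAtDepth(Node root, int targetDepth) {
              if (root == null) return 0;
              Queue<Node> q = new ArrayDeque<>();
              q.add(root);
              int depth = 0;
              while (!q.isEmpty()) {
                  int levelSize = q.size(); // all nodes currently queued share this depth
                  int levelSum = 0;
                  for (int i = 0; i < levelSize; i++) {
                      Node n = q.remove();
                      levelSum += n.value();
                      if (n.left() != null) q.add(n.left());
                      if (n.right() != null) q.add(n.right());
                  }
                  if (depth == targetDepth) return levelSum;
                  depth++; // one increment per level, not per node
              }
              return 0; // tree is shallower than targetDepth
          }

          public static void main(String[] args) {
              Node root = new Node(0,
                      new Node(10, new Node(30, null, null), new Node(40, null, null)),
                      new Node(20, new Node(50, null, null), new Node(60, null, null)));
              System.out.println(sumAtDepth(root, 2)); // 180
          }
      }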

    Read the article

  • How can I tell groovy/grails not to try to "re-encode" binary data? (Revised title)

    - by ?????
    I have a groovy/grails application that needs to serve images. It works fine on my dev box; the image is returned properly. Here's the start of the returned JPEG, as seen by od -cx:

      0000000 377 330 377 340 \0 020 J F I F \0 001 001 001 001 ,
              d8ff e0ff 1000 464a 4649 0100 0101 2c01

    but on the production box there's some garbage in front, and the d8ff e0ff before the 1000 is missing:

      0000000 ? ** ** ? ** ** ? ** ** ? ** ** \0 020 J F
              bfef efbd bdbf bfef efbd bdbf 1000 464a
      0000020 I F \0 001 001 001 \0 H \0 H \0 \0 ? ** ** ?
              4649 0100 0101 4800 4800 0000 bfef efbd

    It's the exact same code; I just moved the .war over and ran it on a different machine. (Isn't Java supposed to be write once, run everywhere?) Any ideas? An "encoding" problem? The code is sent to the response like this:

      response.contentType = "image/jpeg";
      response.outputStream << out;

    Here's the code that locates the image on an internal application server and re-serves the image. I've pared down the code a bit to remove the error handling, etc., to make it easier to read:

      def show = {
          def address = "http://internal.application.server:9899/img?photoid=${params.id}"
          def out = new ByteArrayOutputStream()
          out << new URL(address).openStream()
          response.contentLength = out.size();
          // XXX If you don't do this hack, "head" requests won't work!
          if (request.method == 'HEAD') {
              render( text : "", contentType : "image/jpeg" );
          } else {
              response.contentType = "image/jpeg";
              response.outputStream << out;
          }
      }

    Update: I tried setting the character encoding:

      response.setCharacterEncoding("ISO-8859-1");
      if (request.method == 'HEAD') {
          render( text : "", contentType : "image/jpeg" );
      } else {
          response.contentType = "image/jpeg;charset=ISO-8859-1";
          response.outputStream << out;
      }

    but it made no difference in the output. On my production machine, the binary bytes in the image are re-encoded/escaped as if they were UTF-8 (see Michael's explanation below). It works fine on my development machine.
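
    Those bfef efbd words are the bytes EF BF BD, the UTF-8 encoding of the Unicode replacement character, which is the classic sign that the bytes passed through a character Writer somewhere on the way out. One way to take the framework out of the equation is to copy the raw bytes through the servlet output stream yourself; a plain-Java servlet sketch of that approach, with the internal image URL as a placeholder:

      import java.io.InputStream;
      import java.io.OutputStream;
      import java.net.URL;
      import javax.servlet.ServletException;
      import javax.servlet.http.HttpServlet;
      import javax.servlet.http.HttpServletRequest;
      import javax.servlet.http.HttpServletResponse;

      public class ImageProxyServlet extends HttpServlet {
          @Override
          protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                  throws ServletException, java.io.IOException {
              resp.setContentType("image/jpeg");
              // Copy raw bytes; never touch resp.getWriter(), which would
              // apply the response character encoding to binary data.
              String address = "http://internal.application.server:9899/img?photoid="
                      + req.getParameter("id");
              try (InputStream in = new URL(address).openStream();
                   OutputStream outStream = resp.getOutputStream()) {
                  byte[] buf = new byte[8192];
                  int n;
                  while ((n = in.read(buf)) != -1) {
                      outStream.write(buf, 0, n);
                  }
              }
          }
      }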

    Read the article

  • Streaming binary data to WCF rest service gives Bad Request (400) when content length is greater than 64k

    - by Mikey Cee
    I have a WCF service that takes a stream:

      [ServiceContract]
      public class UploadService : BaseService
      {
          [OperationContract]
          [WebInvoke(BodyStyle=WebMessageBodyStyle.Bare, Method=WebRequestMethods.Http.Post)]
          public void Upload(Stream data)
          {
              // etc.
          }
      }

    This method is to allow my Silverlight application to upload large binary files, the easiest way being to craft the HTTP request by hand from the client. Here is the code in the Silverlight client that does this:

      const int contentLength = 64 * 1024; // 64 kB
      var request = (HttpWebRequest)WebRequest.Create("http://localhost:8732/UploadService/");
      request.AllowWriteStreamBuffering = false;
      request.Method = WebRequestMethods.Http.Post;
      request.ContentType = "application/octet-stream";
      request.ContentLength = contentLength;
      using (var outputStream = request.GetRequestStream())
      {
          outputStream.Write(new byte[contentLength], 0, contentLength);
          outputStream.Flush();
          using (var response = request.GetResponse());
      }

    Now, in the case above, where I am streaming 64 kB of data (or less), this works OK, and if I set a breakpoint in my WCF method I can examine the stream and see 64 kB worth of zeros - yay! The problem arises if I send anything more than 64 kB of data, for instance by changing the first line of my client code to the following:

      const int contentLength = 64 * 1024 + 1; // 64 kB + 1 B

    This now throws an exception when I call request.GetResponse():

      The remote server returned an error: (400) Bad Request.

    In my WCF configuration I have set maxReceivedMessageSize, maxBufferSize and maxBufferPoolSize to 2147483647, but to no avail. Here are the relevant sections from my service's app.config:

      <service name="UploadService">
        <endpoint address="" binding="webHttpBinding"
                  bindingName="StreamedRequestWebBinding"
                  contract="UploadService"
                  behaviorConfiguration="webBehavior">
          <identity>
            <dns value="localhost" />
          </identity>
        </endpoint>
        <host>
          <baseAddresses>
            <add baseAddress="http://localhost:8732/UploadService/" />
          </baseAddresses>
        </host>
      </service>
      <bindings>
        <webHttpBinding>
          <binding name="StreamedRequestWebBinding"
                   bypassProxyOnLocal="true"
                   useDefaultWebProxy="false"
                   hostNameComparisonMode="WeakWildcard"
                   sendTimeout="00:05:00"
                   openTimeout="00:05:00"
                   receiveTimeout="00:05:00"
                   maxReceivedMessageSize="2147483647"
                   maxBufferSize="2147483647"
                   maxBufferPoolSize="2147483647"
                   transferMode="StreamedRequest">
            <readerQuotas maxArrayLength="2147483647" maxStringContentLength="2147483647" />
          </binding>
        </webHttpBinding>
      </bindings>
      <behaviors>
        <endpointBehaviors>
          <behavior name="webBehavior">
            <webHttp />
          </behavior>
        </endpointBehaviors>
      </behaviors>

    How do I make my service accept more than 64 kB of streamed post data?

    Read the article

  • Load a 6 MB binary file in a SQL Server 2005 VARBINARY(MAX) column using ADO/VC++?

    - by Feroz Khan
    How do I load a binary file (.bin) of size 6 MB into a VARBINARY(MAX) column of a SQL Server 2005 database using ADO in a VC++ application? This is the code I am using to load the file, which I previously used to load a .bmp file:

      BOOL CSaveView::PutECGInDB(CString strFilePath, FieldPtr pFileData)
      {
          //Open the file
          CFile fileImage;
          CFileStatus fileStatus;
          fileImage.Open(strFilePath, CFile::modeRead);
          fileImage.GetStatus(fileStatus);

          //Allocate memory for the data
          ULONG nBytes = (ULONG)fileStatus.m_size;
          HGLOBAL hGlobal = GlobalAlloc(GPTR, nBytes);
          LPVOID lpData = GlobalLock(hGlobal);

          //Read the file into memory
          fileImage.Read(lpData, nBytes);

          HRESULT hr;
          _variant_t varChunk;
          long lngOffset = 0;
          UCHAR chData;
          SAFEARRAY FAR *psa = NULL;
          SAFEARRAYBOUND rgsabound[1];
          try
          {
              //Create a safe array to store the bytes
              rgsabound[0].lLbound = 0;
              rgsabound[0].cElements = nBytes;
              psa = SafeArrayCreate(VT_UI1, 1, rgsabound);
              while(lngOffset < (long)nBytes)
              {
                  chData = ((UCHAR*)lpData)[lngOffset];
                  hr = SafeArrayPutElement(psa, &lngOffset, &chData);
                  if(hr != S_OK)
                  {
                      return false;
                  }
                  lngOffset++;
              }
              lngOffset = 0;
              //Assign the safe array to a variant
              varChunk.vt = VT_ARRAY | VT_UI1;
              varChunk.parray = psa;
              hr = pFileData->AppendChunk(varChunk);
              if(hr != S_OK)
              {
                  return false;
              }
          }
          catch(_com_error &e)
          {
              //Get info from _com_error
              _bstr_t bstrSource(e.Source());
              _bstr_t bstrDescription(e.Description());
              _bstr_t bstrErrorMessage(e.ErrorMessage());
              _bstr_t bstrErrorCode(e.Error());
              TRACE("Exception thrown for classes generated by #import");
              TRACE("\tCode = %08lx\n", (LPCSTR)bstrErrorCode);
              TRACE("\tCode Meaning = %s\n", (LPCSTR)bstrErrorMessage);
              TRACE("\tSource = %s\n", (LPCSTR)bstrSource);
              TRACE("\tDescription = %s\n", (LPCSTR)bstrDescription);
          }
          catch(...)
          {
              TRACE("***Unhandled exception***");
          }
          //Free memory
          GlobalUnlock(lpData);
          return true;
      }

    But when I read the same file back using the GetChunk function, it gives me all 0s, although the size I get back is the same as the file uploaded. Your help will be highly appreciated.

    Read the article

  • How do I develop browser plugins with cross-platform and cross-browser compatibility in mind?

    - by Schnapple
    My company currently has a product which relies on a custom, in-house ActiveX control. The technology it employs (TWAIN) is itself cross-platform by design, but our solution is obviously limited to Internet Explorer on Windows. Long term we would like to become cross-browser and cross-platform (i.e., support other browsers on Windows, support the Macintosh or Linux). Obviously if we wanted to support Firefox on Windows I would need to write a plugin for it. But if we wanted to support the Macintosh, how do I attack that? Is it possible to compile a version of the Firefox plugin that runs on the Mac? Would I be remiss to not also support Safari on the Mac? Are there any plugins which are cross-browser on a platform? (i.e., can any browsers run plugins for other browsers) Since TWAIN is so low-level to the operating system, I do not think Java would be a solution in any capacity, but I could be wrong. What do people generally do when they want to support multiple platforms with a process that will need to be cross-platform and cross-browser compatible?

    Read the article

  • jQuery Internet Explorer 8 compatibility issue: does not load data unless history is deleted...

    - by Scarface
    Hey guys, I have a weird problem. I have an update system that refreshes data on a time interval. It works well in all browsers except Internet Explorer 8. The problem is that once it loads the data, no matter how the data updates afterwards, it will not update the display until the internet history is cleared. I am not using any cookies server-side... Has anyone ever encountered something like this? Here is my JavaScript; thanks for any assistance in advance:

      function prepare(response) {
          var d = new Date();
          count++;
          d.setTime(response.time * 1000);
          var mytime = d.getHours() + ':' + d.getMinutes() + ':' + d.getSeconds();
          var string = '<li class="shoutbox-list" id="list-' + count + '">'
              + '<span class="shoutbox-list-nick"><a href="statistics.php?user=' + response.user + '">' + response.user + '</a></span>'
              + ' <span class="date">' + mytime + '</span><br>'
              + '<span class="msg">' + response.message + '</span>'
              + '</li>';
          return string;
      }

      function refresh() {
          $.getJSON(files + "shoutbox.php?action=view&time=" + lastTime + "&topic_id=" + topic_id, function(json) {
              if (json.length) {
                  for (i = 0; i < json.length; i++) {
                      $('#daddy-shoutbox-list').prepend(prepare(json[i]));
                      $('#list-' + count).fadeIn(1500);
                  }
                  var j = i - 1;
                  lastTime = json[j].time;
              }
              //alert(lastTime);
          });
          timeoutID = setTimeout(refresh, 3000);
      }

      $(document).ready(function() {
          var options = {
              dataType: 'json',
              beforeSubmit: validate,
              success: function(response, status) {
                  if (response.error == 'success') {
                      success(response, status);
                  } else {
                      $.prompt(response.error);
                  }
              }
          };
          $('#daddy-shoutbox-form').ajaxForm(options);
          timeoutID = setTimeout(refresh, 100);
      });

    Read the article

  • Providing multi-version databases for backward compatibility for production applications/databases.

    - by JavaRocky
    How can I manage multiple versions of a database easily? I have some data (exposed as views selecting data that originates in tables in other schemas) which other databases may reference by various means, including database synonyms and links. I wish to provide a sort of interface/guarantee for future applications/databases that use this data, in case I ever need to update the views for correctness or applicability inside my database. How can I achieve this in a maintainable, controlled and easy way? I am using Oracle 10g, if that matters.

    Read the article
