Search Results

Search found 60932 results on 2438 pages for 'data operations'.

Page 444 of 2438

  • SharePoint WSS 3.0 - Fastest approach to build data-bound forms?

    - by Rusty Scupper
    Pardon the vague question, but I've just inherited a project to build a couple dozen forms pulling data from a SQL 2005 database. The forms are mostly standard database lookups with just a couple of updates, so the data layer is very simple -- I'm the DBA -- but we recently started using SharePoint WSS 3.0 for a departmental web site and I would prefer to integrate these forms into WSS rather than build a separate standalone ASP.NET app. I'm hoping there is some RAD framework out there that integrates with WSS for data-bound forms. I've been searching online but haven't found anything very promising, and the Data View web part in SharePoint Designer looks complicated to integrate with an external SQL database. TIA, Rusty

    Read the article

  • Autocorrelation method for pitch determination: what is the input data format?

    - by harsh
    I have read some code for pitch determination using the autocorrelation method. Can anybody please tell me what the input data (passed as the argument to the DetectPitch() function) would be here:

      /* requires <stdio.h>, <stdlib.h>, <math.h> */
      double DetectPitch(short* data) {
          int sampleRate = 2048;

          /* Create sine wave (the buffer holds short samples, so allocate it as short*) */
          short *buffer = malloc(1024 * sizeof(short));
          double amplitude = 0.25 * 32768;   /* 0.25 * max value of short */
          double frequency = 726.0;
          for (int n = 0; n < 1024; n++) {
              buffer[n] = (short)(amplitude * sin((2 * 3.14159265 * n * frequency) / sampleRate));
          }

          doHighPassFilter(data);
          printf("Pitch from sine wave: %f\n", detectPitchCalculation(buffer, 50.0, 1000.0, 1, 1));
          printf("Pitch from mic: %f\n", detectPitchCalculation(data, 50.0, 1000.0, 1, 1));
          return 0;
      }

    Read the article

  • How to use data receive event in Socket class?

    - by affan
    I have written a simple client that uses TcpClient in .NET to communicate. To wait for data messages from the server I use a read thread that makes blocking Read() calls on the socket. When I receive something I have to raise various events. These events occur on the worker thread, so you cannot update a UI from them directly. Invoke() can be used, but for the end developer it's awkward, since my SDK will be used by people who may not use a UI at all, or who use Presentation Framework, which has a different way of handling this. Invoke() in our test app (a MicroStation add-in) takes a long time at the moment: MicroStation is a single-threaded application, and calling Invoke on its thread is not good because it is always busy drawing and doing other work, so messages take too long to process. I want my events to be raised on the same thread as the UI so the user does not have to go through the Dispatcher or Invoke. Now, how can I be notified by the socket when data arrives? Is there a built-in callback for that? I would like a WinSock-style receive event without a separate read thread, and I also do not want to use a window timer to poll for data. I found the IOControlCode.AsyncIO flag for the IOControl() function, whose help says it enables notification for when data is waiting to be received (it is equal to the Winsock 2 FIOASYNC constant), but I could not find any example of how to use it to get a notification. If I remember right, in MFC/Winsock you had to create a window of size (0,0) that was just used to listen for the data-receive and other socket events, but I don't know how to do that in a .NET application.
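
    One common pattern, shown here only as a minimal sketch: use Socket.BeginReceive() so the framework calls back when data arrives, and capture a SynchronizationContext on the thread that creates the receiver so the event is raised back on that thread without the consumer calling Invoke. The AsyncReceiver class and DataReceived event names below are invented for illustration.

      using System;
      using System.Net.Sockets;
      using System.Threading;

      class AsyncReceiver
      {
          private readonly Socket _socket;                     // assumed already connected
          private readonly byte[] _buffer = new byte[4096];
          private readonly SynchronizationContext _context;    // captured on the creating (UI) thread

          public event Action<byte[]> DataReceived;            // raised on the creating thread

          public AsyncReceiver(Socket connectedSocket)
          {
              _socket = connectedSocket;
              _context = SynchronizationContext.Current ?? new SynchronizationContext();
          }

          public void Start()
          {
              _socket.BeginReceive(_buffer, 0, _buffer.Length, SocketFlags.None, OnReceive, null);
          }

          private void OnReceive(IAsyncResult ar)
          {
              int read = _socket.EndReceive(ar);
              if (read > 0)
              {
                  byte[] data = new byte[read];
                  Array.Copy(_buffer, data, read);

                  // Marshal the event back to the thread that created this object.
                  _context.Post(_ => { var handler = DataReceived; if (handler != null) handler(data); }, null);

                  Start();   // queue the next receive
              }
          }
      }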

    Read the article

  • Copying 6000 tables and data from SQL Server to Oracle: fastest method?

    - by nazer555
    I need to copy the tables and data (about 5 years of data, 6,200 tables) stored in SQL Server. I am using DataStage and an ODBC connection, and DataStage automatically creates the tables with data, but it is taking 2-3 hours per table because the tables are very large (0.5 GB, 300+ columns and about 400k rows). How can I do this fastest? At this rate I am only able to copy 5 tables per day, but I need to move all 6,000 tables within 30 days.

    Read the article

  • Easiest way to convert json data into objects with methods attached?

    - by John Mee
    What's the quickest and easiest way to convert my JSON, containing the data of the objects, into actual objects with methods attached? By way of example, I get data for a fruitbowl with an array of fruit objects which in turn contain an array of seeds, thus:

      {"fruitbowl": [
          { "name": "apple",  "color": "red",    "seeds": [] },
          { "name": "orange", "color": "orange", "seeds": [
              {"size": "small", "density": "hard"},
              {"size": "small", "density": "soft"}
          ]}
      ]}

    That's all nice and good, but down on the client we do stuff with this fruit, like eat it and plant trees...

      var fruitbowl = []

      function Fruit(name, color, seeds){
          this.name = name
          this.color = color
          this.seeds = seeds
          this.eat = function(){
              // munch munch
          }
      }

      function Seed(size, density){
          this.size = size
          this.density = density
          this.plant = function(){
              // grow grow
          }
      }

    My ajax success routine is currently looping over the thing and constructing each object in turn, and it doesn't handle the seeds yet, because before I go looping over seed constructors I'm thinking: is there not a better way?

      success: function(data){
          fruitbowl.length = 0
          $.each(data.fruitbowl, function(i, f){
              fruitbowl.push(new Fruit(f.name, f.color, f.seeds))
          })
      }

    I haven't explored looping over the objects as they are and attaching all the methods. Would that work?

    Read the article

  • Getting data from the aspx.cs code-behind to the .aspx page

    - by Brad8118
    I am using a jQuery plugin that allows me to change the order of things in a list by dragging and dropping them. My goal is to grab a list of my objects (AlertInfo) and use it in a JavaScript function. In a test project I was able to pass the data to the page with a JSON web service call, but we don't have a web service page now, so I tried to grab it from an aspx.cs page and it hasn't worked.

      // .aspx page
      $.ajax({
          type: "POST",
          url: "~/Alerts/GetAlerts",
          data: "{}",
          contentType: "application/json; charset=utf-8",
          dataType: "json",
          success: function (msg) {
              var data = eval("(" + msg.d + ")");
              jQuery.each(data, function (rec) {
                  AlertList[AlertList.length] = new objAlert(this.id, this.title, this.details,
                      JSONDateSerializationFix(this.startdate), JSONDateSerializationFix(this.enddate));
                  UpdateDisplayList();
              })
          },
          error: function (msg) { alert("BRAD" + msg); }
      });

    The issue is that the Alerts page behind the URL /Alerts/GetAlerts is Alerts.aspx.cs. I can't figure out whether I can use this ajax call to invoke a method in an aspx.cs page.

      // Code-behind page (aspx.cs)
      [WebMethod]
      //[ScriptMethod(ResponseFormat = ResponseFormat.Json)]
      public string GetAlerts()
      {
          List<AlertInfo> list = AlertInfo.GetTestAlerts();
          return new JavaScriptSerializer().Serialize(list);
      }

      public List<AlertInfo> GetAlertsList()
      {
          List<AlertInfo> list = AlertInfo.GetTestAlerts();
          return list;
      }

    So I was hoping that I could load the data into an ASP.NET control (a DataList) and then grab the data from it:

      // Code-behind page
      protected void Page_Load(object sender, EventArgs e)
      {
          dataListAlertList.DataSource = GetAlertsList();
          dataListAlertList.DataBind();
      }

      public static List<AlertInfo> GetTestAlerts()
      {
          List<AlertInfo> list = new List<AlertInfo>();
          list.Add(new AlertInfo("0", "Alert 1 Title", "Alert 1 Detail", "10/10/2010", "10/10/2011"));
          list.Add(new AlertInfo("1", "Alert 2 Title", "Alert 2 Detail", "10/10/2010", "10/10/2011"));
          return list;
      }

      // .aspx page
      $(document).ready(function () {
          var a1 = $("#dataListAlertList").val();
          // do fun stuff now.
      });

    But I keep getting undefined....
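
    If this is a WebForms page (sketched here under that assumption, with the page class name guessed from Alerts.aspx.cs), the usual way to make a code-behind method callable from $.ajax is an ASP.NET AJAX "page method": it must be public and static, and it is addressed with the page URL plus the method name, e.g. url: "Alerts.aspx/GetAlerts" rather than "/Alerts/GetAlerts". The rest of the ajax call and the msg.d handling can stay as in the question.

      // Alerts.aspx.cs - a page method callable from the existing $.ajax code
      using System.Collections.Generic;
      using System.Web.Script.Serialization;
      using System.Web.Script.Services;
      using System.Web.Services;

      public partial class Alerts : System.Web.UI.Page
      {
          [WebMethod]
          [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
          public static string GetAlerts()
          {
              List<AlertInfo> list = AlertInfo.GetTestAlerts();
              return new JavaScriptSerializer().Serialize(list);
          }
      }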

    Read the article

  • Possibility of language data type not mapped to shipped .NET Framework?

    - by John K
    Does anybody know of a managed programming language implemented on .NET that contains a specialized data type that is not mapped through to the Common Type System/FCL/BCL or one that does not have a shipped .NET equivalent (e.g. shipped standard types like System.String, System.Int32)? This question would likely come from the perspective of someone porting a compiler (although I'm not doing that). Is it as simple as the language creating a new data type outside the BCL/FCL for its specialized type? If so does this hinder interoperability between programming languages that are otherwise accustomed to mapping all their built-in data types to what's in the BCL/FCL, like Visual Basic and C#? I can imagine this situation might come about if an obscure language compiler of some kind is ported to .NET for which there is no direct mapping of one of its implicit data types to the shipped Framework. How is this situation supported or allowed in general? What would be the expectation of the compiler and the Common Language Runtime?

    Read the article

  • How to copy data from another workbook and paste onto related group rows?

    - by leighla
    Hi there, how do I copy data from all the workbooks in a folder onto workbook 1, into its corresponding row groups? The attached images show the sample worksheet, which is the file I want to paste data into (the main template), and the wb2 sample, which is one of the worksheets in the folder that I want to copy data from. As you can see, workbook 2 does not include all of the tasks, so I need to copy all of the data from workbook 2 and paste it into the corresponding row group (column A) in the original workbook. I then need to do this for all workbooks in the folder. Any help would be most appreciated!

    Read the article

  • Single windows service to provide access to cached data?

    - by Matthias
    I need a solution where a single Windows service provides access to cached data for various consumers: an MVC web application, a .NET assembly (COM interop) used within a classic ASP page, other Windows services, and a Windows Forms application. So the data must be accessible from various processes. The cached data is read-only. For now, all processes are located on the same machine. The environment is .NET Framework 3.5 and C#. My question is: how can multiple appdomains/processes retrieve cached data from a single Windows service?
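
    Since everything runs on one machine and the stack is .NET 3.5, one candidate is a WCF endpoint hosted inside the Windows service and exposed over named pipes: every local process (the web app, the WinForms app, the COM-visible assembly) talks to it through the same contract. This is only a sketch; the contract, operation and address names are invented for illustration, and the real operations would depend on what is cached.

      using System;
      using System.ServiceModel;

      [ServiceContract]
      public interface ICacheService
      {
          [OperationContract]
          string GetValue(string key);          // hypothetical operation
      }

      public class CacheService : ICacheService
      {
          public string GetValue(string key)    // would read from the in-memory cache
          {
              return "cached value for " + key;
          }
      }

      class HostSketch
      {
          static void Main()
          {
              // In practice this hosting code lives in the Windows service's OnStart.
              var host = new ServiceHost(typeof(CacheService), new Uri("net.pipe://localhost/cache"));
              host.AddServiceEndpoint(typeof(ICacheService), new NetNamedPipeBinding(), "");
              host.Open();

              // Any local process can consume it like this:
              var factory = new ChannelFactory<ICacheService>(
                  new NetNamedPipeBinding(), new EndpointAddress("net.pipe://localhost/cache"));
              ICacheService proxy = factory.CreateChannel();
              Console.WriteLine(proxy.GetValue("someKey"));

              Console.ReadLine();
              host.Close();
          }
      }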

    Read the article

  • How to properly load HTML data from third party website using MVC+AJAX?

    - by Dmitry
    I'm building an ASP.NET MVC2 website that lets users store and analyze data about goods found on various online trade sites. When a user is filling in a form to create or edit an item, he should have an "Import data" button that automatically fills some fields based on data from a third-party website. The question is: what should this button do under the hood? I see at least two possible solutions. First: do the import on the client side using AJAX and jQuery's load method. I tried it in IE8 and received a browser warning popup about insecure script actions, which is of course completely unacceptable. Second: add a method ImportData(string url) to the ItemController class. It is called via AJAX, does the import and data processing server-side, and returns the result as JSON to the client. I tried it and received a server exception, (503) Server Unavailable, when loading the HTML data into an XmlDocument. Also, I have a feeling that dealing with HTML that is not well-formed (missing closing tags, etc.) will be a huge pain. Any ideas on how to parse such HTML documents?
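
    The second approach (fetching server-side) avoids the browser's same-origin restrictions entirely. A minimal sketch of what such an action could look like, using the ImportData/ItemController names from the question; the ExtractBetween helper is a crude placeholder, and a tolerant third-party parser such as Html Agility Pack is the usual answer to markup that is not well-formed.

      using System.Net;
      using System.Web.Mvc;

      public class ItemController : Controller
      {
          [HttpPost]
          public ActionResult ImportData(string url)
          {
              using (var client = new WebClient())
              {
                  // The fetch happens server-side, so there is no cross-domain script warning.
                  string html = client.DownloadString(url);

                  // Placeholder parsing; swap in a real HTML parser for production use.
                  string title = ExtractBetween(html, "<title>", "</title>");

                  return Json(new { title });
              }
          }

          private static string ExtractBetween(string text, string start, string end)
          {
              int i = text.IndexOf(start);
              if (i < 0) return null;
              i += start.Length;
              int j = text.IndexOf(end, i);
              return j < 0 ? null : text.Substring(i, j - i);
          }
      }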

    Read the article

  • Store data in the file system rather than a SQL Server or Oracle database

    - by nunu
    Hi all, I am working on an employee management system and I have two tables (for example) in the database, as given below.

      EmployeeMaster (DB table structure)
      EmployeeID (PK) | EmployeeName | City

      MonthMaster (DB table structure)
      Month | Year | EmployeeID (FK) | PresentDays | BasicSalary

    Now my question: I want to store the data in the file system rather than in SQL Server or Oracle. I want my data in file-system storage for insert, edit and delete operations, while keeping the relations between the objects too. I am a C# developer; does anybody have thoughts or ideas on how to store data in the file system while keeping the relations between records? Thanks in advance.
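
    One low-tech sketch of the idea (an embedded database such as SQL Server Compact or SQLite is the other common answer): serialize each "table" to its own XML file with XmlSerializer and keep the EmployeeID relation as a plain value that your code enforces. The FileStore class and its method names below are illustrative only.

      using System.Collections.Generic;
      using System.IO;
      using System.Xml.Serialization;

      public class EmployeeMaster
      {
          public int EmployeeID;
          public string EmployeeName;
          public string City;
      }

      public class MonthMaster
      {
          public int Month;
          public int Year;
          public int EmployeeID;      // "foreign key" kept as a plain value, checked in code
          public int PresentDays;
          public decimal BasicSalary;
      }

      public static class FileStore
      {
          public static void Save<T>(string path, List<T> items)
          {
              using (var stream = File.Create(path))
                  new XmlSerializer(typeof(List<T>)).Serialize(stream, items);
          }

          public static List<T> Load<T>(string path)
          {
              if (!File.Exists(path)) return new List<T>();
              using (var stream = File.OpenRead(path))
                  return (List<T>)new XmlSerializer(typeof(List<T>)).Deserialize(stream);
          }
      }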

    Read the article

  • How to notify a client program of table-data changes?

    - by JMSA
    Suppose I have an application that accesses data resident in a central DB server, and more than one user accesses that data from client machines networked with the DB server. Suppose two client machines are running a copy of the application and two users are accessing the same DB table. How can I automatically refresh the data on the GUI being viewed by one client as soon as a change is made by another client to the table data? Which technology should be used to solve this particular scenario in .NET? WCF?
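
    WCF duplex callbacks are one way to push change notices through a central service; if the database is SQL Server 2005 or later, another option is query notifications via SqlDependency, so each client hears about changes directly from the database. A minimal sketch of the latter follows; the table, query, class and event names are invented for illustration, and Service Broker must be enabled on the database for this to work.

      using System;
      using System.Data.SqlClient;

      public class TableWatcher
      {
          private readonly string _connectionString;

          public event EventHandler DataChanged;    // hypothetical event the GUI subscribes to

          public TableWatcher(string connectionString)
          {
              _connectionString = connectionString;
              SqlDependency.Start(_connectionString);
          }

          public void Watch()
          {
              using (var connection = new SqlConnection(_connectionString))
              using (var command = new SqlCommand(
                  "SELECT CustomerID, Name FROM dbo.Customers", connection))   // hypothetical table
              {
                  var dependency = new SqlDependency(command);
                  dependency.OnChange += delegate
                  {
                      EventHandler handler = DataChanged;
                      if (handler != null) handler(this, EventArgs.Empty);
                      Watch();    // notifications fire once, so re-subscribe each time
                  };

                  connection.Open();
                  using (command.ExecuteReader()) { }   // executing the query registers the subscription
              }
          }
      }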

    Read the article

  • Oracle: Is there a way to get the column data types for a view?

    - by rally25rs
    For a table in oracle, I can query "all_tab_columns" and get table column information, like the data type, precision, whether or not the column is nullable. In SQL Developer or TOAD, you can click on a view in the GUI and it will spit out a list of the columns that the view returns and the same set of data (data type, precision, nullable, etc). So my question is, is there a way to query this column definition for a view, the way you can for a table? How do the GUI tools do it?

    Read the article

  • How to handle lifecycle of dynamically allocated data in Windows messages?

    - by nang
    Simple task: send a Windows message with dynamically allocated data, e.g. an arbitrary-length string. How would you manage the responsibility for freeing this data? The receiver(s) of the Windows message could be responsible for freeing it. But how can you guarantee that all messages will actually be received, and thus that the linked data will be freed? Imagine the situation where the receiver is shutting down, so it won't process its message queue any more; however, the message queue still exists (for some time) and can still accept messages, which will never be processed. Thanks!

    Read the article

  • How to display data in a DataGridView using multiple threads?

    - by Mark
    Hi, I have an application where I read/receive data (text) all the time, and I need to display this data in a DataGridView. What is the best way to do that in real time, given that the data will be changing constantly? I thought about multithreading; if this is a good idea, can you point me to a link explaining how to implement it? Thanks
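
    A minimal sketch of the usual pattern, assuming a background thread owns the blocking read and the grid lives on the UI thread; the ReceiveText stand-in and the form layout are made up for illustration. Control.BeginInvoke (or a BackgroundWorker) is what carries each update across to the UI thread.

      using System;
      using System.Threading;
      using System.Windows.Forms;

      public class LiveGridForm : Form
      {
          private readonly DataGridView _grid = new DataGridView { Dock = DockStyle.Fill, ReadOnly = true };

          public LiveGridForm()
          {
              _grid.Columns.Add("time", "Time");
              _grid.Columns.Add("text", "Text");
              Controls.Add(_grid);
          }

          protected override void OnShown(EventArgs e)
          {
              base.OnShown(e);
              new Thread(ReadLoop) { IsBackground = true }.Start();
          }

          private void ReadLoop()
          {
              while (true)
              {
                  string line = ReceiveText();   // stand-in for the real blocking receive

                  // Controls may only be touched from the thread that created them,
                  // so marshal the grid update across with BeginInvoke.
                  _grid.BeginInvoke((MethodInvoker)delegate
                  {
                      _grid.Rows.Add(DateTime.Now.ToString("HH:mm:ss"), line);
                  });
              }
          }

          private static string ReceiveText()
          {
              Thread.Sleep(500);                 // pretend data arrives twice a second
              return "sample data";
          }

          [STAThread]
          static void Main() { Application.Run(new LiveGridForm()); }
      }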

    Read the article

  • Retrieving data from a database: retrieve only when needed or get everything?

    - by RHaguiuda
    I have a simple application to store contacts. It uses a simple relational database to store contact information, such as name, address and other data fields. While designing it, a question came to my mind: when designing programs that use databases, should I retrieve all database records and store them in objects in my program, so I get very fast performance, or should I always fetch data only when it is required? Of course, retrieving all data is only an option if there isn't too much of it, but do you use this approach when you can be sure the database will stay small (fewer than 300 records, for example)? I once designed a similar application that fetched data only when needed, but it was slow (it used an Access database). Thanks for all help.

    Read the article

  • Using Spring, Hibernate and Scala, is there a better way to load test data than DbUnit?

    - by egervari
    Here are some things I really dislike about DbUnit:

    1) You cannot specify the exact ordering of the inserts, because DbUnit likes to group your inserts by table name rather than by the order you define them in the XML file. This is a problem when you have records depending on records in other tables, so you have to disable foreign key constraints during your tests... which actually sucks, because those foreign key constraints will fire in production while your tests won't be aware of them!

    2) They seem hellbent on forcing you to use an XML namespace to define your XML... and I honestly can't be bothered to do this. I like the data.xml without any namespace. It works. But they are so hellbent on deprecating it.

    3) Creating different XML files on a per-test basis is hard, so it actually encourages creating data for your entire app. Unfortunately, this process gets bloated once the data grows in size and things get entangled. There has got to be a better way to split your test data into chunks without having to copy/paste a lot of it across all of your tests.

    4) Keeping track of id references in a big XML file is just impossible. If you have 130 domain classes, it gets bewildering. This model simply does not scale.

    Is there something less bloated and better in the Spring/Hibernate space? DbUnit has worn out its welcome and I'm really looking for something better.

    Read the article

  • Flex Tree with infinite parents and children

    - by Tempname
    I am working on a tree component and I am having a bit of an issue with populating the data provider for this tree. The data that I get back from my database is a simple array of value objects. Each value object has two properties, ObjectID and ParentID. For parents the ParentID is null, and for children the ParentID is the ObjectID of the parent. Any help with this is greatly appreciated. Essentially the tree should look something like this: Parent1 Child1 Child1 Child2 Child1 Child2 Parent2 Child1 Child2 Child3 Child1. This is the current code that I am testing with:

      public function setDataProvider(data:Array):void
      {
          var tree:Array = new Array();

          for (var i:Number = 0; i < data.length; i++)
          {
              // do the top level array
              if (!data[i].parentID)
              {
                  tree.push(data[i], getChildren(data[i].objectID, data));
              }
          }

          function getChildren(objectID:Number, data:Array):Array
          {
              var childArr:Array = new Array();
              for (var k:Number = 0; k < data.length; k++)
              {
                  if (data[k].parentID == objectID)
                  {
                      childArr.push(data[k]);
                      //getChildren(data[k].objectID, data);
                  }
              }
              return childArr;
          }

          trace(ObjectUtil.toString(tree));
      }

    Here is a cross-section of my data:

      ObjectID  ParentID
      1         NULL
      10        NULL
      8         NULL
      6         NULL
      4         6
      3         6
      9         6
      2         6
      11        7
      7         8
      5         8
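
    The grouping itself is language-neutral; here is a sketch of the idea in C# (the ActionScript version has the same shape, swapping the dictionary for an Object keyed by objectID): index every record by ObjectID once, then attach each record to its parent's children collection, treating records with a null ParentID as roots. This handles any nesting depth without recursion. The Node and TreeBuilder names are illustrative only.

      using System.Collections.Generic;
      using System.Linq;

      public class Node
      {
          public int ObjectID;
          public int? ParentID;
          public List<Node> Children = new List<Node>();
      }

      public static class TreeBuilder
      {
          // Groups a flat list of (ObjectID, ParentID) records into a tree:
          // records whose ParentID is null become the roots.
          public static List<Node> Build(List<Node> flat)
          {
              var byId = flat.ToDictionary(n => n.ObjectID);
              var roots = new List<Node>();

              foreach (var node in flat)
              {
                  Node parent;
                  if (node.ParentID.HasValue && byId.TryGetValue(node.ParentID.Value, out parent))
                      parent.Children.Add(node);    // attach to its parent...
                  else
                      roots.Add(node);              // ...or treat it as a top-level node
              }
              return roots;
          }
      }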

    Read the article

  • Does Compressed Sensing bring anything new to data Compression?

    - by anon
    Compressed sensing is great for situations where capturing data is expensive (either in energy or time), and thus fewer samples can be taken. However, in situations like image compression, where the data is already on the computer, does compressed sensing offer anything? For example, would it offer better data compression? Would it result in better image search?... (Note: if you don't know what the field of compressed sensing is, please do not respond.)

    Read the article

  • What data structure would be the least painful DataTable replacement?

    - by MatthewMartin
    I'm storing a lot of sorted ~10-row, 2-column key/value pairs in the ASP.NET cache; they're the data for dropdown lists. Right now they are all DataTables, which isn't very space efficient (the rule of thumb is a 10x increase in size when data is stored in a DataSet).

      // Old code
      DataTable table = dataAccess.GetDataTable();
      dropDownList.DataSource = table;

      // Hoped-for new code
      Unknown data = dataAccess.GetSomethingMoreSpaceEfficient();
      dropDownList.DataSource = data;

    What pre-existing data structures are similar enough to DataTable that they would minimize code breakage and reduce the serialized size when stored in the ASP.NET cache?
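
    A minimal sketch of one such replacement, assuming a GetLookupPairs accessor is added on the data-access side: a List<KeyValuePair<string, string>> binds to a DropDownList directly, because KeyValuePair exposes Key and Value as properties, which is all the list control needs, and it serializes far smaller than a DataTable.

      using System.Collections.Generic;
      using System.Web.UI.WebControls;

      public static class LookupBinding
      {
          // Binds a small key/value list (e.g. pulled from the ASP.NET cache) to a DropDownList.
          public static void Bind(DropDownList list, IList<KeyValuePair<string, string>> pairs)
          {
              list.DataValueField = "Key";    // KeyValuePair<,> exposes Key/Value as properties
              list.DataTextField  = "Value";
              list.DataSource     = pairs;
              list.DataBind();
          }
      }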

    Read the article

  • Combining FileStream and MemoryStream to avoid disk accesses/paging while receiving gigabytes of data?

    - by w128
    I'm receiving a file as a stream of byte[] data packets (the total size isn't known in advance) that I need to store somewhere before processing it immediately after it's been received (I can't do the processing on the fly). The total received file size can vary from as small as 10 KB to over 4 GB. One option for storing the received data is to use a MemoryStream, i.e. a sequence of MemoryStream.Write(bufferReceived, 0, count) calls to store the received packets. This is very simple, but obviously will result in an out-of-memory exception for large files. An alternative is to use a FileStream, i.e. FileStream.Write(bufferReceived, 0, count). This way, no out-of-memory exceptions will occur, but what I'm unsure about is bad performance due to disk writes (which I don't want to occur as long as plenty of memory is still available) - I'd like to avoid disk access as much as possible, but I don't know of a way to control this.

    I did some testing and most of the time there seems to be little performance difference between, say, 10,000 consecutive calls of MemoryStream.Write() vs FileStream.Write(), but a lot seems to depend on buffer size and the total amount of data in question (i.e. the number of writes). Obviously, MemoryStream size reallocation is also a factor. Does it make sense to use a combination of MemoryStream and FileStream, i.e. write to the memory stream by default, but once the total amount of data received is over e.g. 500 MB, write it to a FileStream; then read in chunks from both streams for processing the received data (first process 500 MB from the MemoryStream, dispose it, then read from the FileStream)? Another solution is a custom memory stream implementation that doesn't require contiguous address space for its internal array allocation (i.e. a linked list of memory streams); this way, at least in 64-bit environments, out-of-memory exceptions should no longer be an issue. Con: extra work, more room for mistakes.

    So how do FileStream vs MemoryStream reads/writes behave in terms of disk access and memory caching, i.e. the data size/performance balance? I would expect that as long as enough RAM is available, FileStream would internally read/write from memory (cache) anyway, and virtual memory would take care of the rest. But I don't know how often FileStream will explicitly access the disk when being written to. Any help would be appreciated.
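
    One way to sketch the hybrid idea from the question: buffer packets in a MemoryStream until a threshold is crossed, then spill everything received so far into a temp file and keep appending there. The SpillableBuffer class name, the 500 MB threshold and the temp-file choice are all placeholders; the point is only that the switch-over can be hidden behind a single Write method.

      using System;
      using System.IO;

      // Writes incoming packets to memory until a threshold, then spills everything to a temp file.
      public sealed class SpillableBuffer : IDisposable
      {
          private const long Threshold = 500L * 1024 * 1024;   // 500 MB, as in the question
          private MemoryStream _memory = new MemoryStream();
          private FileStream _file;

          public void Write(byte[] buffer, int offset, int count)
          {
              if (_file == null && _memory.Length + count > Threshold)
              {
                  // Switch over: copy what we have so far into a temp file, then keep appending there.
                  _file = new FileStream(Path.GetTempFileName(), FileMode.Create,
                                         FileAccess.ReadWrite, FileShare.None, 64 * 1024);
                  _memory.WriteTo(_file);
                  _memory.Dispose();
                  _memory = null;
              }

              if (_file != null) _file.Write(buffer, offset, count);
              else _memory.Write(buffer, offset, count);
          }

          // Returns a readable stream positioned at the start of the received data.
          public Stream OpenForReading()
          {
              if (_file != null) { _file.Flush(); _file.Position = 0; return _file; }
              _memory.Position = 0;
              return _memory;
          }

          public void Dispose()
          {
              if (_memory != null) _memory.Dispose();
              if (_file != null) _file.Dispose();
          }
      }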

    Read the article
