Search Results

Search found 108959 results on 4359 pages for 'ado net data services'.

  • Using Core Data Concurrently and Reliably

    - by John Topley
    I'm building my first iOS app, which in theory should be pretty straightforward, but I'm having difficulty making it sufficiently bulletproof for me to feel confident submitting it to the App Store. Briefly, the main screen has a table view; upon selecting a row it segues to another table view that displays information relevant to the selected row in a master-detail fashion. The underlying data is retrieved as JSON from a web service once a day and then cached in a Core Data store. The data from before that day is deleted to stop the SQLite database file from growing indefinitely. All data persistence operations are performed using Core Data, with an NSFetchedResultsController underpinning the detail table view.

    The problem I am seeing is that if you switch quickly between the master and detail screens several times whilst fresh data is being retrieved, parsed and saved, the app freezes or crashes completely. There seems to be some sort of race condition, maybe due to Core Data importing data in the background whilst the main thread is trying to perform a fetch, but I'm speculating. I've had trouble capturing any meaningful crash information; usually it's a SIGSEGV deep in the Core Data stack.

    The list below shows the actual order of events that happen when the detail table view controller is loaded, split between the main thread and the background thread:

    Main thread:        viewDidLoad
    Main thread:        Get JSON data (using AFNetworking)
    Main thread:        Create child NSManagedObjectContext (MOC)
    Background thread:  Parse JSON data
    Background thread:  Insert managed objects in child MOC
    Background thread:  Save child MOC
    Background thread:  Post import completion notification
    Main thread:        Receive import completion notification
    Main thread:        Save parent MOC
    Main thread:        Perform fetch and reload table view
    Background thread:  Delete old managed objects in child MOC
    Background thread:  Save child MOC
    Background thread:  Post deletion completion notification
    Main thread:        Receive deletion completion notification
    Main thread:        Save parent MOC

    Once the AFNetworking completion block is triggered when the JSON data has arrived, a nested NSManagedObjectContext is created and passed to an "importer" object that parses the JSON data and saves the objects to the Core Data store. The importer executes using the new performBlock method introduced in iOS 5:

        NSManagedObjectContext *child = [[NSManagedObjectContext alloc]
            initWithConcurrencyType:NSPrivateQueueConcurrencyType];
        [child setParentContext:self.managedObjectContext];
        [child performBlock:^{
            // Create importer instance, passing it the child MOC...
        }];

    The importer object observes its own MOC's NSManagedObjectContextDidSaveNotification and then posts its own notification, which is observed by the detail table view controller. When this notification is posted, the table view controller performs a save on its own (parent) MOC. I use the same basic pattern with a "deleter" object for deleting the old data after the new data for the day has been imported. This occurs asynchronously after the new data has been fetched by the fetched results controller and the detail table view has been reloaded.

    One thing I am not doing is observing any merge notifications or locking any of the managed object contexts or the persistent store coordinator. Is this something I should be doing? I'm a bit unsure how to architect this all correctly, so I would appreciate any advice.

    Read the article

  • ASP.Net Architecture Specific to Shared/Static functions

    - by Maxim Gershkovich
    Hello all. Could someone please advise: in the context of an ASP.NET application, is a Shared/static function common to all users? If, for example, you have a function

        Public Shared Function GetStockByID(StockID As Guid) As Stock

    is that function common to all current users of your application? Or is the Shared function only specific to the current user and shared in the context of ONLY that current user?

    So more specifically my question is this: besides database concurrency issues such as table locking, do I need to concern myself with threading issues in Shared functions in an ASP.NET application? In my head, let's say my application namespace is MyTestApplicationNamespace. Every time a new user connects to my site, a new instance of MyTestApplicationNamespace is created and therefore all Shared functions are common to that instance and user but NOT common across multiple users. Is this correct?
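
    To make the distinction concrete, here is a minimal C# sketch (not from the original question; the type and member names are assumptions). Static members belong to the application domain, so a static field is visible to every request from every user, while local variables inside a static method are private to each call:

        using System;
        using System.Threading;

        public class Stock
        {
            public Guid Id { get; set; }
        }

        public static class StockRepository
        {
            // Shared across ALL requests and users in the AppDomain, so any mutation
            // of this field needs to be thread-safe.
            private static int _callCount;

            public static Stock GetStockByID(Guid stockId)
            {
                Interlocked.Increment(ref _callCount);   // safe update of shared state
                var stock = new Stock { Id = stockId };   // local state, per call, not shared
                return stock;
            }
        }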

    Read the article

  • .NET Deployment of Interface/Classes for Command Pattern Question

    - by Jonno
    In theory I would like to produce two projects: (1) ASP.NET (Server A) and (2) the DAL running on Server B. I would like to utilise command objects to communicate with the DAL. ASP.NET instantiates a command class, e.g. CmdGetAllUsers, which implements the IMyCommand interface, and sends it to the DAL (using ASMX or WCF). My question is: would the class definition of CmdGetAllUsers need to exist on the DAL server? Or would having the interface definition be enough? My goal is to reduce the need to redeploy the DAL code, and have it as a fairly simple pass-through layer. Many thanks for your time.
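
    For reference, a minimal sketch of the shapes described above (the payload member is an assumption added for illustration). Note that with the WCF DataContractSerializer the receiving side generally needs the concrete command type, or a known-type registration for it, so the interface definition alone is usually not enough to deserialize CmdGetAllUsers:

        using System;
        using System.Runtime.Serialization;

        public interface IMyCommand
        {
        }

        [DataContract]
        public class CmdGetAllUsers : IMyCommand
        {
            [DataMember]
            public Guid RequestId { get; set; }   // hypothetical payload for illustration
        }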

    Read the article

  • Need help converting jQuery, AJAX, JSON and ASP.NET

    - by Haja Mohaideen
    I am trying out this tutorial: http://www.ezzylearning.com/tutorial.aspx?tid=5869127. It works perfectly. What I am now trying to do is to host the aspx content as an html file. This html file is hosted on my WampServer, which is on my laptop. The ASP.NET code is hosted on my test server. When I try to access it, I get the following error:

        Resource interpreted as Script but transferred with MIME type text/html: "http://201.x.x.x/testAjax/Default.aspx/AddProductToCart?callback=jQuery17103264484549872577_1346923699990&{%20pID:%20%226765%22,%20qty:%20%22100%22,%20lblType:%20%2220%22%20}&_=1346923704482". jquery.min.js:4
        Uncaught SyntaxError: Unexpected token <

    I am not sure how to solve this problem.

    index.html code:

        $(function () {
            $('#btnAddToCart').click(function () {
                var result = $.ajax({
                    type: "POST",
                    url: "http://202.161.45.124/testAjax/Default.aspx/AddProductToCart",
                    crossDomain: true,
                    data: '{ pID: "6765", qty: "100", lblType: "20" }',
                    contentType: "application/json; charset=utf-8",
                    dataType: "jsonp",
                    success: succeeded,
                    failure: function (msg) { alert(msg); },
                    error: function (xhr, err) { alert(err); }
                });
            });
        });

        function succeeded(msg) {
            alert(msg.d);
        }

        function btnAddToCart_onclick() {
        }
        </script>
        </head>
        <body>
            <form name="form1" method="post">
                <div>
                    <input type="button" id="btnAddToCart" onclick="return btnAddToCart_onclick()" value="Button" />
                </div>
            </form>

    aspx.vb:

        Imports System.Web.Services
        Imports System.Web.Script.Services

        <ScriptService()>
        Public Class WebForm1
            Inherits Page

            Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
                Session("test") = ""
            End Sub

            <WebMethod()>
            <ScriptMethod(UseHttpGet:=False, ResponseFormat:=ResponseFormat.Json)>
            Public Shared Function AddProductToCart(pID As String, qty As String, lblType As String) As String
                Dim selectedProduct As String = String.Format("+ {0} - {1} - {2}", pID, qty, lblType)
                HttpContext.Current.Session("test") += selectedProduct
                Return HttpContext.Current.Session("test").ToString()
            End Function

        End Class

    Read the article

  • Entity Data Model Wizard not creating tables in EDMX file

    - by Shawn
    I'm trying the database-first approach by creating an ADO.NET Entity Data Model using the wizard with the AdventureWorks2012 DB. Testing the DB connection works, and the connection string is added to the App.config. I'm selecting all the tables except the ones marked as (dbo): AWBuildVersion, DatabaseLog, and ErrorLog. When the wizard finishes, the .edmx file is blank, and if I view the file in XML view the EntityContainer is empty. I'm using VS 2010 and .NET Framework 4.0.

    Read the article

  • Which is the "best" data access framework/approach for C# and .NET?

    - by Frans
    (EDIT: I made it a community wiki as it is more suited to a collaborative format.)

    There are a plethora of ways to access SQL Server and other databases from .NET. All have their pros and cons and it will never be a simple question of which is "best" - the answer will always be "it depends". However, I am looking for a comparison at a high level of the different approaches and frameworks in the context of different levels of systems. For example, I would imagine that for a quick-and-dirty Web 2.0 application the answer would be very different from an in-house enterprise-level CRUD application. I am aware that there are numerous questions on Stack Overflow dealing with subsets of this question, but I think it would be useful to try to build a summary comparison. I will endeavour to update the question with corrections and clarifications as we go. So far, this is my understanding at a high level - but I am sure it is wrong... I am primarily focusing on the Microsoft approaches to keep this focused.

    ADO.NET Entity Framework
    - Database agnostic
    - Good because it allows swapping backends in and out
    - Bad because it can hit performance, and database vendors are not too happy about it
    - Seems to be MS's preferred route for the future
    - Complicated to learn (though, see 267357)
    - It is accessed through LINQ to Entities, so it provides ORM, thus allowing abstraction in your code

    LINQ to SQL
    - Uncertain future (see "Is LINQ to SQL truly dead?")
    - Easy to learn (?)
    - Only works with MS SQL Server
    - See also "Pros and cons of LINQ"

    "Standard" ADO.NET
    - No ORM
    - No abstraction, so you are back to "roll your own" and playing with dynamically generated SQL
    - Direct access, allows potentially better performance

    This ties in to the age-old debate of whether to focus on objects or relational data, to which the answer of course is "it depends on where the bulk of the work is", and since that is an unanswerable question hopefully we don't have to go into that too much. IMHO, if your application is primarily manipulating large amounts of data, it does not make sense to abstract it too much into objects in the front-end code; you are better off using stored procedures and dynamic SQL to do as much of the work as possible on the back-end. Whereas, if you primarily have user interaction which causes database interaction at the level of tens or hundreds of rows, then ORM makes complete sense. So, I guess my argument for good old-fashioned ADO.NET would be in the case where you manipulate and modify large datasets, in which case you will benefit from the direct access to the backend. Another case, of course, is where you have to access a legacy database that is already guarded by stored procedures.

    ASP.NET Data Source Controls
    - Are these something altogether different or just a layer over standard ADO.NET? Would you really use these if you had a DAL or if you implemented LINQ or Entities?

    NHibernate
    - Seems to be a very powerful and flexible ORM
    - Open source

    Some other relevant links: "NHibernate or LINQ to SQL"; "Entity Framework vs LINQ to SQL".
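
    To ground the comparison, here is a minimal C# sketch (not from the original question; the connection string, table, entity and context names such as dbo.Orders and MyDbContext are assumptions, and the context stub stands in for a real EF/LINQ to SQL context) showing the same aggregate computed with "standard" ADO.NET and with a LINQ-based ORM:

        using System;
        using System.Data.SqlClient;
        using System.Linq;

        // Hypothetical entity and context, kept minimal so the sketch is self-contained.
        public class Order { public int CustomerId { get; set; } public decimal OrderValue { get; set; } }
        public class MyDbContext { public IQueryable<Order> Orders { get; set; } }

        public static class OrderQueries
        {
            // "Standard" ADO.NET: you write the SQL and handle the plumbing yourself.
            public static decimal TotalOrderValue(string connectionString, int customerId)
            {
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(
                    "SELECT SUM(OrderValue) FROM dbo.Orders WHERE CustomerId = @id", conn))
                {
                    cmd.Parameters.AddWithValue("@id", customerId);
                    conn.Open();
                    object result = cmd.ExecuteScalar();
                    return result == DBNull.Value ? 0m : (decimal)result;
                }
            }

            // ORM style (LINQ to Entities / LINQ to SQL): the provider generates the SQL.
            public static decimal TotalOrderValueOrm(MyDbContext db, int customerId)
            {
                return db.Orders
                         .Where(o => o.CustomerId == customerId)
                         .Select(o => (decimal?)o.OrderValue)
                         .Sum() ?? 0m;
            }
        }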

    Read the article

  • .NET make a copy of an embedded file resource to the local drive

    - by Matt H.
    Hi, I'm new to the realm of working with files in .NET. I'm creating a WPF application in VB.NET with the 3.5 Framework. (If you provide an example in C#, that's perfectly fine.) In my project I have a template for an MS Access database. My desired behavior is that when the user clicks File -> New, they can create a new copy of this template, give it a filename, and save it to their local directory. The database already has the tables and some starting data needed to interface with my application (a user-friendly data editor). I'm thinking the approach is to include this "template.accdb" file as a resource in the project and write it to a file somehow at runtime? Any guidance will be very, very appreciated. Thanks!
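
    Since the question invites a C# example, here is a minimal sketch of that approach, assuming the template is compiled as an embedded resource; the resource name "MyApp.template.accdb" is an assumption and in practice depends on your project's default namespace and folder layout:

        using System.IO;
        using System.Reflection;

        public static class TemplateWriter
        {
            public static void ExtractTemplate(string destinationPath)
            {
                Assembly assembly = Assembly.GetExecutingAssembly();

                using (Stream resource = assembly.GetManifestResourceStream("MyApp.template.accdb"))
                {
                    if (resource == null)
                        throw new FileNotFoundException("Embedded resource 'MyApp.template.accdb' was not found.");

                    // Copy the resource stream to disk with a plain buffer loop (works on .NET 3.5).
                    using (FileStream file = File.Create(destinationPath))
                    {
                        byte[] buffer = new byte[81920];
                        int bytesRead;
                        while ((bytesRead = resource.Read(buffer, 0, buffer.Length)) > 0)
                        {
                            file.Write(buffer, 0, bytesRead);
                        }
                    }
                }
            }
        }

    If the exact resource name is unclear, Assembly.GetManifestResourceNames() lists the names actually embedded in the assembly.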

    Read the article

  • in .net, what programming model would be good for prototyping, but then reusable for production (for

    - by Greg
    Hi, in .NET land, what would be a good approach for quick prototyping of a concept (i.e. development just on my PC) that could then be extended out to production (users across a LAN/WAN), BUT in a fashion where the model/business logic code and the data access layer code can be used as is? One thought I had, for example, was to: (a) build WinForms with a business logic and Entity Framework layer over SQL Server Express on my PC, then (b) move to ASP.NET (reusing the business logic / data library) with SQL Server/IIS. Any comments? Other suggestions?
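
    One way to picture the reuse being asked about is a plain class library that both front ends reference; a minimal C# sketch (the names IOrderService, OrderService and MyDbContext are assumptions, and the context stub stands in for a real Entity Framework context):

        using System.Linq;

        // Hypothetical EF-style entity and context, kept minimal for the sketch.
        public class Order { public int Id { get; set; } public decimal Total { get; set; } }
        public class MyDbContext { public IQueryable<Order> Orders { get; set; } }

        // Business logic lives in the shared library...
        public interface IOrderService
        {
            Order GetOrder(int id);
        }

        public class OrderService : IOrderService
        {
            private readonly MyDbContext _db;

            public OrderService(MyDbContext db) { _db = db; }

            public Order GetOrder(int id)
            {
                // ...so the WinForms prototype and the later ASP.NET site call the same code.
                return _db.Orders.SingleOrDefault(o => o.Id == id);
            }
        }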

    Read the article

  • C# SQL Data Adapter Fill on existing typed Dataset

    - by René
    I have an option to choose between local data storage (an XML file) or SQL Server-based storage. A long time ago I created a typed DataSet for my application to save data locally in the XML file. Now I have a bool that switches between the server-based version and the local version: if true, my application gets the data from SQL Server. I'm not sure, but it seems that the SqlDataAdapter's Fill method can't fill the data into my existing schema:

        SqlCommand cmd = new SqlCommand("Select * FROM dbo.Categories WHERE CatUserId = 1", _connection);
        cmd.CommandType = CommandType.Text;
        _sqlAdapter = new SqlDataAdapter(cmd);
        _sqlAdapter.TableMappings.Add("Categories", "dbo.Categories");
        _sqlAdapter.Fill(Program.Dataset);

    This should fill my data from dbo.Categories into Categories (in my local, typed DataSet), but it doesn't. It creates a new table with the name "Table". It looks like it can't handle the existing schema. I can't figure it out. Where is the problem? By the way, of course the database request I do isn't very useful that way; it's just a simplified version for testing...
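
    For what it's worth, the behaviour described (a new table named "Table") matches the adapter's default source-table name, which is "Table" when Fill(DataSet) is called. A sketch of the usual wiring, assuming the typed table is called Categories:

        using System.Data;
        using System.Data.SqlClient;

        public static class CategoryLoader
        {
            public static void FillCategories(SqlConnection connection, DataSet typedDataSet)
            {
                var cmd = new SqlCommand("SELECT * FROM dbo.Categories WHERE CatUserId = 1", connection);

                using (var adapter = new SqlDataAdapter(cmd))
                {
                    // Map the adapter's default source table ("Table") onto the typed table name...
                    adapter.TableMappings.Add("Table", "Categories");
                    adapter.Fill(typedDataSet);

                    // ...or skip the mapping and name the target table directly:
                    // adapter.Fill(typedDataSet, "Categories");
                }
            }
        }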

    Read the article

  • ASP.NET - How to edit 'bit' data type?

    - by Peter
    I am coding in Visual Basic. I am using a checkbox control, and depending on its Checked property I need to set/unset a bit column in a SQL Server database. Here's the code:

        Try
            conSQL.Open()
            Dim cmd As New SqlCommand("update Student set send_mail = " + _
                sendemailCheckBox.Checked.ToString + " where student_id = '" _
                + sidnolabel.Text + "'", conSQL)
            cmd.ExecuteNonQuery()
        Finally
            conSQL.Close()
        End Try

    The send_mail attribute is of the bit datatype. This code is not working. How do I go about it?
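
    A minimal sketch of the parameterized form (shown here in C#; the VB.NET equivalent is analogous, and the method name is an assumption). Passing the checkbox state as a real Bit parameter avoids both the "True"/"False" literal problem in the concatenated SQL and the string concatenation itself:

        using System.Data;
        using System.Data.SqlClient;

        public static class StudentUpdater
        {
            public static void UpdateSendMail(SqlConnection conn, string studentId, bool sendMail)
            {
                using (var cmd = new SqlCommand(
                    "UPDATE Student SET send_mail = @send WHERE student_id = @id", conn))
                {
                    cmd.Parameters.Add("@send", SqlDbType.Bit).Value = sendMail;   // bool maps to bit
                    cmd.Parameters.AddWithValue("@id", studentId);

                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }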

    Read the article

  • .NET application with an Oracle Database

    - by Pavitar
    I have to code a desktop application and some dynamic web content. I'm planning to develop it in .NET with an Oracle database, though my dilemma is that my professor says that if I do so, there will be a lot of support issues later on. He says .NET is more compatible with MS SQL Server and MS Access, and that is the trend - everyone does it. Is it true? I have learnt Oracle, so I know of a few features which I wouldn't be able to implement comfortably on SQL Server because of my lack of knowledge of SQL Server databases. I would love to learn the new syntax, but again, I don't have much time to spare.

    Read the article

  • export gridview data

    - by Eric
    What is the best way to export a GridView to an Excel spreadsheet? This seems easy, except that my GridView doesn't have an export attribute. What is the quickest way to do this?
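
    One common approach (a sketch, not the only option; the page, button and grid names are assumptions) is to render the GridView's HTML into the response with an Excel content type from the code-behind. The VerifyRenderingInServerForm override is needed so ASP.NET allows the control to render outside the server form:

        using System;
        using System.IO;
        using System.Web.UI;
        using System.Web.UI.WebControls;

        public class ExportPage : Page
        {
            protected GridView GridView1;   // in a real page this is declared in the .aspx markup

            protected void btnExport_Click(object sender, EventArgs e)
            {
                Response.Clear();
                Response.ContentType = "application/vnd.ms-excel";
                Response.AddHeader("Content-Disposition", "attachment; filename=export.xls");

                using (var stringWriter = new StringWriter())
                using (var htmlWriter = new HtmlTextWriter(stringWriter))
                {
                    GridView1.RenderControl(htmlWriter);   // writes the grid's HTML to the writer
                    Response.Write(stringWriter.ToString());
                }

                Response.End();
            }

            public override void VerifyRenderingInServerForm(Control control)
            {
                // Intentionally empty: confirms the GridView may render outside <form runat="server">.
            }
        }

    Note that this streams HTML under an .xls name, which Excel opens (sometimes with a format warning); producing a genuine .xlsx file requires a spreadsheet library instead.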

    Read the article

  • How can I read back an object stored in a session?

    - by strakastroukas
    First of all, here comes the load part:

        Structure MainStruct
            Dim Ans1 As String
            Dim Ans2 As String
        End Structure

        Dim Build As New List(Of MainStruct)
        ...
        Session("MyData") = Build

    The question is how can I read back the contents of the list stored in the Session? I mean something like...

        Build = Session("MyData").Ans1

    Read the article

  • A cross-platform application WPF, ASP.NET, Silverlight, WP7, XAML

    - by J. Lennon
    Considering the fact that all the applications will interact with the web project (which will use the cloud or web services): is there any way to share my class models between the applications? If yes, what is the best way to do it? As for sending/receiving data from the web service, serializing and deserializing, how can I do this in a simple way without having to manually populate the objects? Any information about these applications would be really helpful!
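
    One illustration of the usual pattern (a sketch only; CustomerDto and JsonHelper are invented names): keep the model types in a project or linked files that every client references, mark them as data contracts, and let a serializer such as DataContractJsonSerializer do the populating instead of hand-written mapping code:

        using System.IO;
        using System.Runtime.Serialization;
        using System.Runtime.Serialization.Json;

        [DataContract]
        public class CustomerDto
        {
            [DataMember] public int Id { get; set; }
            [DataMember] public string Name { get; set; }
        }

        public static class JsonHelper
        {
            public static byte[] Serialize(CustomerDto dto)
            {
                var serializer = new DataContractJsonSerializer(typeof(CustomerDto));
                using (var ms = new MemoryStream())
                {
                    serializer.WriteObject(ms, dto);   // object -> JSON bytes
                    return ms.ToArray();
                }
            }

            public static CustomerDto Deserialize(byte[] json)
            {
                var serializer = new DataContractJsonSerializer(typeof(CustomerDto));
                using (var ms = new MemoryStream(json))
                {
                    return (CustomerDto)serializer.ReadObject(ms);   // JSON bytes -> object
                }
            }
        }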

    Read the article

  • MVC2 AJAX - determining UpdateTargetId based on the returned data

    - by DanielJW
    The scenario: I'm creating a login form for an MVC2 application.

    How I'm doing it: the form submits to an MVC2 action which validates the username/password. If it fails validation, the action returns the form (a partial view) for the user to try again. If it passes validation, the action returns the page the user was visiting before they logged in (a view).

    What I want to happen:
    1 - When the form is submitted and the user validates successfully, the returned result should replace the current page (like what happens if you don't set an UpdateTargetId).
    2 - When the form is submitted and the user fails validation, the returned result should replace the form (like what happens if you set the UpdateTargetId to the form's containing element).

    The problem: I can make both of those things work, but not at the same time. I can either have it always replace the current page, or always just replace the contents of the UpdateTargetId element. But I need it to be able to do either, depending on whether the user successfully validated or not.

    What I need: the ideal solution would be to be able to examine the result of the AJAX request and determine whether to use the UpdateTargetId (replacing just the form) or not (replacing the whole page). I expect it would involve some work with jQuery (assuming it's possible), but I'm not really that great with jQuery yet to figure out how to do it myself. If it can't be done this way, I'm also open to other methods/solutions for making it work in a similar fashion. Thanks in advance.

    Read the article

  • Parallel programming in .NET Framework 4 – Part II

    - by anobre
    Hi everyone, how are you? This post is a continuation of the series started in another post about parallel programming. My goal today is to present PLINQ, something you will be able to use immediately in your projects.

    Parallel LINQ (PLINQ)

    PLINQ is nothing more than a parallel-programming implementation of our famous LINQ, delivered through extension methods. LINQ was released with version 3.0 of the .NET platform, offering a much easier and safer way to manipulate IEnumerable and IEnumerable<T> collections. What we will look at today is the parallel "variation" of LINQ to Objects, which targets collections of objects in memory. The main difference between "normal" LINQ to Objects and the parallel version is that with the latter the processing is performed trying to use all the resources available for it, obtaining a significant performance improvement.

    WARNING: not every operation becomes faster when parallelism is used. Be sure to read the "Performance" section below.

    ParallelEnumerable

    Everything we need for this post is organized in the ParallelEnumerable class. This class contains the methods we will use in this post, and much more: AsParallel, AsSequential, AsOrdered, AsUnordered, WithCancellation, WithDegreeOfParallelism, WithMergeOptions, WithExecutionMode, ForAll, ...

    The most basic example of how to run PLINQ code is the AsParallel method, as in this example:

        var source = Enumerable.Range(1, 10000);
        var evenNums = from num in source.AsParallel()
                       where Compute(num) > 0
                       select num;

    Just as interesting as this convenience is the fact that PLINQ does not always execute in parallel. Depending on the situation and on the analysis of a few aspects of the execution scenario, it may be more appropriate to run the code sequentially, and PLINQ natively makes this choice itself. It is possible to force execution to always use parallelism if necessary: use the WithExecutionMode method in your PLINQ code. A very simple test where we can see the difference is shown below:

        static void Main(string[] args)
        {
            IEnumerable<int> numbers = Enumerable.Range(1, 1000);

            IEnumerable<int> results = from n in numbers.AsParallel()
                                       where IsDivisibleByFive(n)
                                       select n;

            Stopwatch sw = Stopwatch.StartNew();
            IList<int> resultsList = results.ToList();
            Console.WriteLine("{0} itens", resultsList.Count());
            sw.Stop();

            Console.WriteLine("Tempo de execução: {0} ms", sw.ElapsedMilliseconds);
            Console.WriteLine("Fim...");
            Console.ReadKey(true);
        }

        static bool IsDivisibleByFive(int i)
        {
            Thread.SpinWait(2000000);
            return i % 5 == 0;
        }

    Just remove AsParallel from the LINQ statement and you will get a practical sense of the performance difference. (The original post showed two screenshots at this point: 1. the statement using AsParallel; 2. the statement without parallelism.)

    Performance

    Despite all the benefits, we cannot use PLINQ without knowing all of its details. Remember to ask the basic questions: Do I have enough work to justify parallelism? Even with PLINQ's overhead, will we see any benefit? For this reason, visit this link and get to know all the aspects before using the available features.

    Conclusion

    Using parallelism features is great: it increases performance and makes use of the investment made in hardware, all of this with no productivity cost. However, we cannot take advantage of any kind of technology without knowing it in depth first. So make good use of it, but do not forget to keep knowledge ahead of excitement. Cheers.
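
    As a small addition (a sketch, not from the original post), forcing parallel execution with the WithExecutionMode method mentioned above looks like this:

        using System.Linq;

        public static class PlinqSample
        {
            public static int CountDivisibleByFive()
            {
                var source = Enumerable.Range(1, 1000);

                // ForceParallelism overrides PLINQ's own sequential-vs-parallel decision.
                return source.AsParallel()
                             .WithExecutionMode(ParallelExecutionMode.ForceParallelism)
                             .Count(n => n % 5 == 0);
            }
        }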

    Read the article

  • Error converting JSON to .Net object in asp.net

    - by Vinni
    Hello guys, I am unable to convert a JSON string to a .NET object in ASP.NET. I am sending the JSON string from client to server using a hidden field (by putting the JSON object's string form in the hidden field and reading the hidden field value in the code-behind file).

    JSON string/object:

        [[{"OfferId":"1","OrderValue":"11","HostingTypeID":"3"},
          {"OfferId":"1","OrderValue":"11","HostingTypeID":"3"},
          {"OfferId":"1","OrderValue":"11","HostingTypeID":"3"},
          {"OfferId":"1","OrderValue":"2","HostingTypeID":"3"},
          {"OfferId":"1","OrderValue":"2","HostingTypeID":"3"},
          {"OfferId":"1","OrderValue":"67","HostingTypeID":"3"},
          {"OfferId":"1","OrderValue":"67","HostingTypeID":"3"}],
         [{"OfferId":"1","OrderValue":"99","HostingTypeID":"6"}],
         [{"OfferId":"1","OrderValue":"10","HostingTypeID":"8"}]]

    .NET object:

        public class JsonFeaturedOffer
        {
            public string OfferId { get; set; }
            public string OrderValue { get; set; }
            public string HostingTypeID { get; set; }
        }

    Conversion code in the code-behind file:

        byte[] byteArray = Encoding.ASCII.GetBytes(HdnJsonData.Value);
        MemoryStream stream = new MemoryStream(byteArray);
        DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(JsonFeaturedOffer));
        object result = serializer.ReadObject(stream);
        JsonFeaturedOffer jsonObj = result as JsonFeaturedOffer;

    While converting I am getting the following error:

        Expecting element 'root' from namespace ''.. Encountered 'None' with name '', namespace ''.
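
    For reference, a sketch of one way to make the serializer's target type match the shape of the payload above, which is an array of arrays of offers rather than a single offer (the class and method names are inventions for illustration; this assumes the POCO rules pick up the public properties, and adding [DataContract]/[DataMember] attributes makes that explicit):

        using System.Collections.Generic;
        using System.IO;
        using System.Runtime.Serialization.Json;
        using System.Text;

        public static class OfferParser
        {
            public static List<List<JsonFeaturedOffer>> ParseOffers(string json)
            {
                byte[] bytes = Encoding.UTF8.GetBytes(json);
                var serializer = new DataContractJsonSerializer(typeof(List<List<JsonFeaturedOffer>>));

                using (var stream = new MemoryStream(bytes))
                {
                    // Each inner list corresponds to one of the bracketed groups in the payload.
                    return (List<List<JsonFeaturedOffer>>)serializer.ReadObject(stream);
                }
            }
        }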

    Read the article

  • Convert asp.net webforms logic to asp.net MVC

    - by gmcalab
    I had this code in an old ASP.NET WebForms app to take a MemoryStream and return it as the response, showing a PDF. I am now working on an ASP.NET MVC application and looking to do the same thing, but how should I go about returning the MemoryStream as a PDF using MVC? Here's my ASP.NET WebForms code:

        private void ShowPDF(MemoryStream ms)
        {
            try
            {
                // get byte array of pdf in memory
                byte[] fileArray = ms.ToArray();

                // send file to the user
                Page.Response.Cache.SetCacheability(HttpCacheability.NoCache);
                Page.Response.Buffer = true;
                Response.Clear();
                Response.ClearContent();
                Response.ClearHeaders();
                Response.Charset = string.Empty;
                Response.ContentType = "application/pdf";
                Response.AddHeader("content-length", fileArray.Length.ToString());
                Response.AddHeader("Content-Disposition", "attachment;filename=TID.pdf;");
                Response.BinaryWrite(fileArray);
                Response.Flush();
                Response.Close();
            }
            catch
            {
                // and boom goes the dynamite...
            }
        }
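
    A minimal sketch of the MVC-style equivalent (controller and helper names are assumptions): instead of writing to the Response yourself, return a FileResult and let the framework set the headers:

        using System.IO;
        using System.Web.Mvc;

        public class ReportsController : Controller
        {
            public ActionResult ShowPdf()
            {
                MemoryStream ms = BuildPdf();   // hypothetical helper that produces the PDF stream

                // File(...) returns a FileContentResult with the given content type;
                // the third argument turns it into a download named TID.pdf.
                return File(ms.ToArray(), "application/pdf", "TID.pdf");
            }

            private MemoryStream BuildPdf()
            {
                // Placeholder: in a real app this would generate or load the PDF bytes.
                return new MemoryStream();
            }
        }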

    Read the article
