Search Results

Search found 6744 results on 270 pages for 'linq to entities'.

  • .NET database enumeration

    - by erasmus
    In a project, one of my entities is House, which has many enumeration properties (for example, HouseType). Using .NET, LINQ to SQL and SQL Server, how can I create a database with enumerations and use them with LINQ to SQL? What should my approach be?
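    One common approach is to store the enumeration as an int column in SQL Server and expose it as an enum in code. A minimal sketch, assuming the generated LINQ to SQL entity has an int column property named HouseTypeId (the enum values and property names here are illustrative, not from the question):

        public enum HouseType
        {
            Detached = 1,
            SemiDetached = 2,
            Apartment = 3
        }

        // Partial class extending the designer-generated LINQ to SQL entity.
        public partial class House
        {
            public HouseType Kind
            {
                get { return (HouseType)HouseTypeId; }  // int column -> enum
                set { HouseTypeId = (int)value; }       // enum -> int column
            }
        }

    The DBML designer should also accept a fully qualified enum type name in a column's Type property, but the wrapper property keeps the mapping explicit.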

  • How to approach performance issues?

    - by jess
    Hi, we are developing a client-server desktop application (WinForms with SQL Server 2008, using LINQ to SQL). We are now finding many issues related to performance. These relate to querying too much data with LINQ, bad database design, not much caching, etc. What do you suggest we should do, and how should we go about solving these performance issues? One thing I am doing is SQL profiling and trying to fix some queries. As far as caching is concerned, we have static lists. But how do we keep them updated? We don't have any server-side implementation, so these lists can become stale if someone changes the data. Regards
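    For the stale static lists, one simple option is a time-based cache that reloads from the database once its contents are older than a fixed interval. This is only a sketch; the element type, refresh interval and loader delegate are assumptions, not taken from the question:

        using System;
        using System.Collections.Generic;

        public static class LookupCache
        {
            private static List<string> _items = new List<string>();
            private static DateTime _loadedAtUtc = DateTime.MinValue;
            private static readonly TimeSpan MaxAge = TimeSpan.FromMinutes(5);
            private static readonly object _sync = new object();

            public static List<string> GetItems(Func<List<string>> loadFromDb)
            {
                lock (_sync)
                {
                    if (DateTime.UtcNow - _loadedAtUtc > MaxAge)
                    {
                        _items = loadFromDb();          // hit the database again
                        _loadedAtUtc = DateTime.UtcNow;
                    }
                    return _items;
                }
            }
        }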

  • How do I get results from a Linq query in the order of IDs that I provide?

    - by Keltex
    I'm looking to get query results back from Linq in the order that I pass IDs to the query. So it would look something like this:

        var IDs = new int[] { 5, 20, 10 };
        var items = from mytable in db.MyTable
                    where IDs.Contains(mytable.mytableID)
                    orderby // not sure what to do here
                    select mytable;

    I'm hoping to get items in the order of IDs (5, 20, 10). (Note this is similar to this question, but I would like to do it in Linq instead of SQL.)
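    One way to do this (a sketch, not the only option) is to let the database filter by the IDs and then apply the caller-defined order in memory, since ordering by the position of an element in a local array generally cannot be translated to SQL:

        var IDs = new int[] { 5, 20, 10 };

        var items = db.MyTable
                      .Where(t => IDs.Contains(t.mytableID))          // translated to an IN clause
                      .AsEnumerable()                                 // switch to LINQ to Objects
                      .OrderBy(t => Array.IndexOf(IDs, t.mytableID))  // order as supplied by the caller
                      .ToList();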

  • c#: Design advice. Using DataTable or List<MyObject> for a generic rule checker

    - by Andrew White
    Hi, I have about 100,000 rows of generic data. The columns/properties of this data are user-definable and are of the usual data types (string, int, double, date). There will be about 50 columns/properties. I have two needs:

    1. To be able to calculate new columns/properties using an expression, e.g. Column3 = Column1 * Column2. Ultimately I would like to be able to use external data via a callback, e.g. Column3 = Column1 * GetTemperature. The expressions are relatively simple; maths operations, Sum, Count and IF are the only necessary functions.
    2. To be able to filter/group the data and perform aggregations, e.g. Sum(Data.Column1) Where(Data.Column2 == "blah").

    As far as I can see I have two options:

    1. Use a DataTable. Need 1 above is achieved by using DataColumn.Expression; need 2 is achieved by using DataTable.DefaultView.RowFilter and C# code.
    2. Use a List of generic objects, each with a Dictionary<string, object> to store the values. Need 1 could be achieved by something like NCalc; need 2 is achieved using LINQ.

    DataTable pros: DataColumn.Expression is built in. Cons: RowFilter and hand-coded C# are not as "nice" as LINQ, and DataColumn.Expression does not support callbacks(?); a workaround could be to fetch and substitute the external value when creating the calculated column.
    GenericList pros: LINQ syntax, and NCalc supports callbacks. Cons: implementing NCalc / a generic calculation engine.

    Based on the above I would think the generic-list approach would win, but something I have not factored in is performance, which for some reason I think would be better with a DataTable. Does anyone have a gut feeling / experience with LINQ vs. DataTable performance? How about NCalc? As I said, there are about 100,000 rows of data with 50 columns, of which maybe 20 are calculated. In total about 50 rules will be run against the data, so there will be roughly 5 million row/object scans. Would really appreciate any insights. Thanks. PS: Of course using a database with SQL, views, etc. would be the easiest solution, but for various reasons it can't be implemented.
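    A minimal sketch of the generic-list option (the row type and the sample rule below are illustrative assumptions, not taken from the question) shows how a LINQ filter plus aggregation over dictionary-backed rows might look:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        public class DataRowObject
        {
            public DataRowObject()
            {
                Values = new Dictionary<string, object>();
            }

            // Column name -> value; columns are user definable.
            public Dictionary<string, object> Values { get; private set; }
        }

        public static class RuleRunner
        {
            // Example rule: Sum(Column1) where Column2 == "blah"
            public static double SumColumn1WhereColumn2IsBlah(IEnumerable<DataRowObject> rows)
            {
                return rows.Where(r => (string)r.Values["Column2"] == "blah")
                           .Sum(r => Convert.ToDouble(r.Values["Column1"]));
            }
        }

    Whether this beats a DataTable at 5 million row scans is best settled with a quick benchmark on representative data.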

  • Rewrite SQL Fulltext Function to return Table only

    - by Alex
    I have an MS SQL fulltext function like this:

        (...)
        RETURNS TABLE
        AS
        RETURN
        SELECT *
        FROM fishes
        INNER JOIN CONTAINSTABLE(fishes, *, @keywords, @limit) AS KEY_TBL
            ON fishes.id = KEY_TBL.[KEY]

    When I use this function in LINQ, it generates a special return type which includes all fields of my "fishes" table, plus Key and Rank. How could I rewrite the above query, or change something in LINQ, to omit Key and Rank and just return my "fishes" results (and to have the fulltext search results be of type Fish, which is what I really care about, so I don't have to cast)?

  • How to get an object from a list based upon IEqualityComparer<T>

    - by Greg
    The Contains method in LINQ lets you check for an item using an IEqualityComparer, but I can't find a counterpart method that allows you to retrieve an item with the same comparer. Is this really the best way to do it?

        MyItem myFinderItem = new MyItem(keyField1, keyField2);
        if (myList.Contains(myFinderItem, new MyEqualityComparer()))
        {
            MyItem myRealItem = myList.Single(item => new MyEqualityComparer().Equals(item, myFinderItem));
        }

    (I'm sharing the usage of the IEqualityComparer with a call to the Except LINQ method and I'd like to maintain a single source for equality comparisons.)
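    One way to avoid the double pass is to reuse a single comparer instance and do the lookup in one call with FirstOrDefault. This is a sketch that assumes MyItem is a reference type, so null can signal "not found":

        var comparer = new MyEqualityComparer();   // single instance, shared with the Except call
        MyItem myFinderItem = new MyItem(keyField1, keyField2);

        // FirstOrDefault makes one pass and returns null when nothing matches.
        MyItem myRealItem = myList.FirstOrDefault(item => comparer.Equals(item, myFinderItem));
        if (myRealItem != null)
        {
            // use myRealItem
        }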

  • Is opening too many datacontexts bad?

    - by ryudice
    I've been checking my application with the LINQ to SQL profiler, and I noticed that it opens a lot of DataContexts; most of them are opened by the LinqDataSource I used, since my repositories only use the instance stored in Request.Items. Is it bad to open too many DataContexts? And how can I make my LinqDataSource use the DataContext that I store in Request.Items for the duration of the request? Thanks for any help!
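    If the goal is one DataContext per request, one option (a sketch; the control name, context type and the HttpContext.Current.Items key are assumptions) is to handle the LinqDataSource's ContextCreating event and hand it the instance already created for the request:

        // In the page's code-behind.
        protected void LinqDataSource1_ContextCreating(object sender, LinqDataSourceContextEventArgs e)
        {
            // Reuse the per-request context instead of letting the control create its own.
            e.ObjectInstance = (MyDataContext)HttpContext.Current.Items["DataContext"];
        }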

  • Helping linqtosql datacontext use implicit conversion between varchar column in the database and tab

    - by user213256
    I am creating an MS SQL database table, "Orders", that will contain a varchar(50) field, "Value", containing a string that represents a slightly complex data type, "OrderValue". I am using a LINQ to SQL DataContext class, which automatically types the "Value" column as a string. I gave the "OrderValue" class implicit conversion operators to and from a string, so I can easily use implicit conversion with the LINQ to SQL classes like this:

        // get an order from the orders table
        MyDataContext db = new MyDataContext();
        Order order = db.Orders.Single(o => o.id == 1);

        // use implicit conversion to turn the string representation of the order
        // value into the complex data type.
        OrderValue value = order.Value;

        // adjust one of the fields in the complex data type
        value.Shipping += 10;

        // use implicit conversion to store the string representation of the complex
        // data type back in the LINQ to SQL order object
        order.Value = value;

        // save changes
        db.SubmitChanges();

    However, I would really like to be able to tell the LINQ to SQL class to type this field as "OrderValue" rather than as "string". Then I would be able to avoid complex code and rewrite the above as:

        // get an order from the orders table
        MyDataContext db = new MyDataContext();
        Order order = db.Orders.Single(o => o.id == 1);

        // The Value field is already typed as "OrderValue" rather than as string.
        // When a string value was read from the database table, it was implicitly
        // converted to the "OrderValue" type.
        order.Value.Shipping += 10;

        // save changes
        db.SubmitChanges();

    In order to achieve this desired goal, I looked at the DataContext designer and selected the "Value" field of the "Order" table. Then, in properties, I changed "Type" to "global::MyApplication.OrderValue". The "Server Data Type" property was left as "VarChar(50) NOT NULL". The project built without errors. However, when reading from the database table, I was presented with the following error message:

        Could not convert from type 'System.String' to type 'MyApplication.OrderValue'.
        at System.Data.Linq.DBConvert.ChangeType(Object value, Type type)
        at Read_Order(ObjectMaterializer`1 )
        at System.Data.Linq.SqlClient.ObjectReaderCompiler.ObjectReader`2.MoveNext()
        at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
        at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
        at Example.OrdersProvider.GetOrders()
        at ... etc

    From the stack trace, I believe this error is happening while reading the data from the table. When presented with converting a string to my custom data type, even though the implicit conversion operators are present, the DBConvert class gets confused and throws an error. Is there anything I can do to help it not get confused and do the implicit conversion? Thanks in advance, and apologies if I have posted in the wrong forum. Cheers / Ben
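    One workaround, sketched below and not a guaranteed fix, is to leave the mapped column typed as string (as the designer generated it) and expose the typed view through a partial class, so the implicit conversion happens in your own property rather than inside LINQ to SQL's materializer:

        public partial class Order
        {
            // Typed wrapper over the designer-generated string property "Value".
            public OrderValue ValueObject
            {
                get { return this.Value; }     // implicit string -> OrderValue
                set { this.Value = value; }    // implicit OrderValue -> string
            }
        }

        // usage
        Order order = db.Orders.Single(o => o.id == 1);
        OrderValue v = order.ValueObject;
        v.Shipping += 10;
        order.ValueObject = v;    // write the modified value back through the wrapper
        db.SubmitChanges();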

  • Filter to values in collection in one query

    - by dotnetdev
    Hi, I have the following LINQ query:

        List<string> Types = Directory.GetFiles(@"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727")
            .Where(x => System.IO.Path.GetFileNameWithoutExtension(x).Contains("Microsoft"))
            .ToList<string>();

    How could I modify this so that it only gets the values stored in a collection, without writing another LINQ query (which I assume would impact performance)? Thanks
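    Assuming "the collection" means a list of allowed names (an assumption on my part; the allowedNames list below is hypothetical), the extra filter can be chained into the same query rather than run as a second pass. Note this sketch keeps just the file names, not the full paths:

        var allowedNames = new List<string> { "Microsoft.Build", "Microsoft.JScript" };

        List<string> Types = Directory
            .GetFiles(@"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727")
            .Select(x => System.IO.Path.GetFileNameWithoutExtension(x))
            .Where(name => name.Contains("Microsoft") && allowedNames.Contains(name))
            .ToList();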

  • Intersection() and Except() is too slow with large collections of custom objects

    - by Theo
    I am importing data from another database. My process imports data from a remote DB into a List<DataModel> named remoteData and also imports data from the local DB into a List<DataModel> named localData. I am then using LINQ to create a list of records that are different, so that I can update the local DB to match the data pulled from the remote DB, like this:

        var outdatedData = this.localData.Intersect(this.remoteData, new OutdatedDataComparer()).ToList();

    I am then using LINQ to create a list of records that no longer exist in remoteData but do exist in localData, so that I can delete them from the local database, like this:

        var oldData = this.localData.Except(this.remoteData, new MatchingDataComparer()).ToList();

    I am then using LINQ to do the opposite of the above to add the new data to the local database, like this:

        var newData = this.remoteData.Except(this.localData, new MatchingDataComparer()).ToList();

    Each collection imports about 70k records, and each of the 3 LINQ operations takes between 5 and 10 minutes to complete. How can I make this faster? Here is the object the collections are using:

        internal class DataModel
        {
            public string Key1 { get; set; }
            public string Key2 { get; set; }
            public string Value1 { get; set; }
            public string Value2 { get; set; }
            public byte? Value3 { get; set; }
        }

    The comparer used to check for outdated records:

        class OutdatedDataComparer : IEqualityComparer<DataModel>
        {
            public bool Equals(DataModel x, DataModel y)
            {
                var e = string.Equals(x.Key1, y.Key1)
                    && string.Equals(x.Key2, y.Key2)
                    && (
                        !string.Equals(x.Value1, y.Value1)
                        || !string.Equals(x.Value2, y.Value2)
                        || x.Value3 != y.Value3
                    );
                return e;
            }

            public int GetHashCode(DataModel obj)
            {
                return 0;
            }
        }

    The comparer used to find old and new records:

        internal class MatchingDataComparer : IEqualityComparer<DataModel>
        {
            public bool Equals(DataModel x, DataModel y)
            {
                return string.Equals(x.Key1, y.Key1) && string.Equals(x.Key2, y.Key2);
            }

            public int GetHashCode(DataModel obj)
            {
                return 0;
            }
        }
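    The GetHashCode implementations that always return 0 are the likely culprit: with a constant hash, Intersect and Except put every element into the same bucket and end up calling Equals against each candidate, which is effectively an O(n * m) scan. A sketch of a hash code built from the same key fields the comparers use (null handling adjusted as needed) should bring the set operations back to roughly linear time:

        public int GetHashCode(DataModel obj)
        {
            // Hash only the key fields: both comparers require the keys to match
            // before they can return true, so equal items still get equal hashes.
            unchecked
            {
                int hash = 17;
                hash = hash * 31 + (obj.Key1 != null ? obj.Key1.GetHashCode() : 0);
                hash = hash * 31 + (obj.Key2 != null ? obj.Key2.GetHashCode() : 0);
                return hash;
            }
        }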

  • NHibernate: Mapping different dynamic components based on a discriminator

    - by George Mauer
    My domain entities each have a set of "fixed" properties and a set of "dynamic" properties which can be added at runtime. I handle this by using NHibernate's dynamic-component functionality.

        public class Product
        {
            public virtual Guid Id { get; }
            public virtual string Name { get; set; }
            public virtual IDictionary DynamicComponents { get; }
        }

    Now I have the following situation:

        public class Customer
        {
            public virtual Guid Id { get; }
            public virtual string Type { get; set; }
            public virtual IDictionary DynamicProperties { get; }
        }

    where a CustomerType is something like "Online" or "InPerson". Furthermore, an Online customer has dynamic properties "Name" and "IPAddress", and an InPerson customer has dynamic properties "Name" and "Salesman". Which customer types are available, and the extra properties on them, are configured in metadata which is used to generate hbm files on application start. I could figure out some way to knock this together using an intermediate DTO layer, but is there any support in NHibernate for this scenario? The only difficulty seems to be that all the different "types" of customer map to the same Customer class.

  • Using an existing IQueryable to create a new dynamic IQueryable

    - by dnoxs
    I have a query as follows:

        var query = from x in context.Employees
                    where (x.Salary > 0 && x.DeptId == 5) || x.DeptId == 2
                    orderby x.Surname
                    select x;

    The above is the original query and returns, let's say, 1000 employee entities. I would now like to take the first query, deconstruct it and recreate a new query that would look like this:

        var query = from x in context.Employees
                    where ((x.Salary > 0 && x.DeptId == 5) || x.DeptId == 2)
                          && (x, i) i % 10 == 0
                    orderby x.Surname
                    select x.Surname;

    This query would return 100 surnames. The syntax is probably incorrect, but what I need to do is attach an additional where clause and modify the select to a single field. I've been looking into the ExpressionVisitor, but I'm not entirely sure how to create a new query based on an existing query. Any guidance would be appreciated. Thank you.
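    Because IQueryable<T> is composable, one approach (a sketch; note that the index-based Where overload is generally not supported by LINQ to Entities, so the "every 10th row" filter is applied in memory here) is to build the new query on top of the existing one instead of rewriting its expression tree:

        // Start from the original query (IQueryable<Employee>) and keep composing.
        var surnames = query
            .AsEnumerable()                   // switch to LINQ to Objects for the indexed filter
            .Where((x, i) => i % 10 == 0)     // keep every 10th employee
            .Select(x => x.Surname)
            .ToList();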

  • Different EF Property DataType than Storage Layer Possible?

    - by dj_kyron
    Hi, I am putting together a WCF Data Service for PatientEntities using Entity Framework. My solution needs to address these requirements:

    The property DateOfBirth of entity Patient is stored in SQL Server as a string. It would be ideal if the entity class did not also use the string type but rather a DateTime type (I would expect this to be possible since we're abstracting away from the storage layer). Where could a conversion mechanism be put in place that would convert to and from DateTime/string so that the entity and SQL Server stay in sync? I cannot change the storage layer's structure, so I have to work around it.

    WCF Data Services (read-only, so no need for saving changes) need to be used, since clients will be able to use LINQ expressions to consume the service. They can generate results based on any query scenario they need and are not constrained by a single method such as GetPatient(int ID).

    I've tried to use DTOs, but I run into the problem of mapping the ObjectContext to a DTO; I don't think that is theoretically possible, or it is too complicated if it is. I've tried to use self-tracking entities, but they require the metadata from the .edmx file if I'm correct, and this doesn't allow a different property data type.

    I also want to add customizations to my entity getter methods, so that a property "MRN" of type string has .Replace("MR~", string.Empty) performed before it is returned. I can add this to the getter methods, but the problem with that is that Entity Framework will overwrite it the next time it regenerates the entity classes. Is there a permanent place I can put these? Should I use POCO instead? How would that work with WCF Data Services? Where would the service get the metadata?
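    For the getter customizations, one place that survives regeneration (a sketch; the MRN and DateOfBirth names come from the question, everything else is assumed) is a partial class next to the generated entities that exposes computed read-only properties instead of editing the generated getters:

        public partial class Patient
        {
            // Cleaned-up view of the stored MRN; the generated MRN property stays untouched.
            public string MrnDisplay
            {
                get { return this.MRN == null ? null : this.MRN.Replace("MR~", string.Empty); }
            }

            // DateTime view over the string DateOfBirth column.
            public DateTime? DateOfBirthValue
            {
                get
                {
                    DateTime parsed;
                    return DateTime.TryParse(this.DateOfBirth, out parsed) ? (DateTime?)parsed : null;
                }
            }
        }

    Whether WCF Data Services will expose such unmapped properties to clients is a separate question and would need to be verified.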

  • Entity Framework + MySQL - Why is the performance so terrible?

    - by Cyril Gupta
    When I decided to use an OR/M (Entity Framework for MySQL this time) for my new project, I was hoping it would save me time, but I seem to have failed at it (for the second time now). Take this simple SQL query:

        SELECT * FROM POST ORDER BY addedOn DESC LIMIT 0, 50

    It executes and gives me results in less than a second, as it should (the table has about 60,000 rows). Here's the equivalent LINQ to Entities query that I wrote for this:

        var q = (from p in db.post
                 orderby p.addedOn descending
                 select p).Take(50);
        var q1 = q.ToList(); // This is where the query is fetched and times out

    But this query never even executes; it ALWAYS times out (without the orderby it takes 5 seconds to run)! My timeout is set to 12 seconds, so you can imagine it is taking much more than that. Why is this happening? Is there a way I can see the actual SQL query that Entity Framework is sending to the database? Should I give up on EF+MySQL and move to plain SQL before I spend an eternity trying to make it work? I've recalibrated my indexes and tried eager loading (which actually makes it fail even without the orderby clause). Please help, I am about to give up on OR/M for MySQL as a lost cause.
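    To see the SQL that Entity Framework generates, ObjectQuery.ToTraceString can be called before the query is enumerated. This is a sketch and assumes the LINQ to Entities query can be cast to ObjectQuery, which is normally the case for queries built against an ObjectContext:

        using System.Data.Objects;

        var q = (from p in db.post
                 orderby p.addedOn descending
                 select p).Take(50);

        // Prints the store command EF would send to MySQL, without executing it.
        var objectQuery = q as ObjectQuery;
        if (objectQuery != null)
        {
            Console.WriteLine(objectQuery.ToTraceString());
        }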

  • EF4 + STE: Reattaching via a WCF Service? Using a new objectcontext each and every time?

    - by Martin
    Hi there, I am planning to use WCF (not RIA) in conjunction with Entity Framework 4 and STEs (self-tracking entities). If I understand this correctly, my WCF service should return an entity or collection of entities (using a List, for example, and not IQueryable) to the client (in my case Silverlight). The client can then change or update the entity; at this point I believe it is self-tracking. This is where I get a bit confused, as there are a lot of reported problems with STEs not tracking. Anyway, to update, I just need to send the entity back to my WCF service to another method that does the update. Should I be creating a new ObjectContext every time, in every method? If I am creating a new ObjectContext every time in every method of my WCF service, don't I need to re-attach the STE to the ObjectContext? So basically this alone wouldn't work?

        using (var ctx = new MyContext())
        {
            ctx.Orders.ApplyChanges(order);
            ctx.SaveChanges();
        }

    Or should I be creating the ObjectContext once in the constructor of the WCF service, so that the first call and every additional call using the same WCF instance uses the same ObjectContext? I could create and destroy the WCF service in each method call from the client, in effect creating a new ObjectContext each time. I understand that it isn't a good idea to keep the ObjectContext alive for very long. Any insight or information would be gratefully appreciated. Thanks

  • Why do I get this exception? {An item with the same key has already been added."})

    - by Alan
    Aknittel NewSellerID is the result of a lookup on tblSellers. These tables (tblSellerListings and tblSellers) are not "officially" joined with a foreign key relationship, either in the model or in the database, but I want some referential integrity maintained for the future. So my issue remains. Why do I get the exception ({"An item with the same key has already been added."}) with this code, if I don't begin each iteration of the foreach loop with a new ObjectContext and end it with SaveChanges, which I think will affect performance. Also, could you tell me why ORCSolutionsDataService.tblSellerListings (An ADO.NET DataServices/WCF object is not IDisposable, like LINQ to Entities?? ============================================== // Add listings to previous seller int NewSellerID = 0; // Look up existing Seller key using SellerUniqueEBAYID var qryCurrentSeller = from s in service.tblSellers where s.SellerEBAYUserID == SellerUserID select s; foreach (var s in qryCurrentSeller) NewSellerID = s.SellerID; // Save the selected listings for this seller foreach (DataGridViewRow dgr in dgvRows) { ORCSolutionsDataService.tblSellerListings NewSellerListing = new ORCSolutionsDataService.tblSellerListings(); NewSellerListing.ItemID = dgr.Cells["txtSellerItemID"].Value.ToString(); NewSellerListing.Title = dgr.Cells["txtSellerItemTitle"].Value.ToString(); NewSellerListing.CurrentPrice = Convert.ToDecimal(dgr.Cells["txtSellerItemPrice"].Value); NewSellerListing.QuantitySold = Convert.ToInt32(dgr.Cells["txtSellerItemSold"].Value); NewSellerListing.EndTime = Convert.ToDateTime(dgr.Cells["txtSellerItemEnds"].Value); NewSellerListing.CategoryName = dgr.Cells["txtSellerItemCategory"].Value.ToString(); NewSellerListing.ExtendedPrice = Convert.ToDecimal(dgr.Cells["txtExtendedReceipts"].Value); NewSellerListing.RetrievedDtime = Convert.ToDateTime(dtSellerDataRetrieved.ToString()); NewSellerListing.SellerID = NewSellerID; service.AddTotblSellerListings(NewSellerListing); } service.SaveChanges(); } catch (Exception ex) { MessageBox.Show("Unable to add a new case. Exception: " + ex.Message); }

  • JPA entities -- org.hibernate.TypeMismatchException

    - by shane lee
    Environment: JDK 1.6, JEE5 Hibernate Core 3.3.1.GA, Hibernate Annotations 3.4.0.GA DB:Informix Used reverse engineering to create my persistence entities from db schema [NB:This is a schema in work i cannot change] Getting exception when selecting list of basic_auth_accounts org.hibernate.TypeMismatchException: Provided id of the wrong type for class ebusiness.weblogic.model.UserAccounts. Expected: class ebusiness.weblogic.model.UserAccountsId, got class ebusiness.weblogic.model.BasicAuthAccountsId Both basic_auth_accounts and user_accounts have composite primary keys and one-to-one relationships. Any clues what to do here? This is pretty important that i get this to work. Cannot find any substantial solution on the net, some say to create an ID class which hibernate has done, and some say not to have a one-to-one relationship. Please help me!! /** * BasicAuthAccounts generated by hbm2java */ @Entity @Table(name = "basic_auth_accounts", schema = "ebusdevt", catalog = "ebusiness_dev", uniqueConstraints = @UniqueConstraint(columnNames = { "realm_type_id", "realm_qualifier", "account_name" })) public class BasicAuthAccounts implements java.io.Serializable { private BasicAuthAccountsId id; private UserAccounts userAccounts; private String accountName; private String hashedPassword; private boolean passwdChangeReqd; private String hashMethodId; private int failedAttemptNo; private Date failedAttemptDate; private Date lastAccess; public BasicAuthAccounts() { } public BasicAuthAccounts(UserAccounts userAccounts, String accountName, String hashedPassword, boolean passwdChangeReqd, String hashMethodId, int failedAttemptNo) { this.userAccounts = userAccounts; this.accountName = accountName; this.hashedPassword = hashedPassword; this.passwdChangeReqd = passwdChangeReqd; this.hashMethodId = hashMethodId; this.failedAttemptNo = failedAttemptNo; } public BasicAuthAccounts(UserAccounts userAccounts, String accountName, String hashedPassword, boolean passwdChangeReqd, String hashMethodId, int failedAttemptNo, Date failedAttemptDate, Date lastAccess) { this.userAccounts = userAccounts; this.accountName = accountName; this.hashedPassword = hashedPassword; this.passwdChangeReqd = passwdChangeReqd; this.hashMethodId = hashMethodId; this.failedAttemptNo = failedAttemptNo; this.failedAttemptDate = failedAttemptDate; this.lastAccess = lastAccess; } @EmbeddedId @AttributeOverrides( { @AttributeOverride(name = "realmTypeId", column = @Column(name = "realm_type_id", nullable = false, length = 32)), @AttributeOverride(name = "realmQualifier", column = @Column(name = "realm_qualifier", nullable = false, length = 32)), @AttributeOverride(name = "accountId", column = @Column(name = "account_id", nullable = false)) }) public BasicAuthAccountsId getId() { return this.id; } public void setId(BasicAuthAccountsId id) { this.id = id; } @OneToOne(fetch = FetchType.LAZY) @PrimaryKeyJoinColumn @NotNull public UserAccounts getUserAccounts() { return this.userAccounts; } public void setUserAccounts(UserAccounts userAccounts) { this.userAccounts = userAccounts; } /** * BasicAuthAccountsId generated by hbm2java */ @Embeddable public class BasicAuthAccountsId implements java.io.Serializable { private String realmTypeId; private String realmQualifier; private long accountId; public BasicAuthAccountsId() { } public BasicAuthAccountsId(String realmTypeId, String realmQualifier, long accountId) { this.realmTypeId = realmTypeId; this.realmQualifier = realmQualifier; this.accountId = accountId; } /** * UserAccounts generated by 
hbm2java */ @Entity @Table(name = "user_accounts", schema = "ebusdevt", catalog = "ebusiness_dev") public class UserAccounts implements java.io.Serializable { private UserAccountsId id; private Realms realms; private UserDetails userDetails; private Integer accessLevel; private String status; private boolean isEdge; private String role; private boolean chargesAccess; private Date createdTimestamp; private Date lastStatusChangeTimestamp; private BasicAuthAccounts basicAuthAccounts; private Set<Sessions> sessionses = new HashSet<Sessions>(0); private Set<AccountGroups> accountGroupses = new HashSet<AccountGroups>(0); private Set<UserPrivileges> userPrivilegeses = new HashSet<UserPrivileges>(0); public UserAccounts() { } public UserAccounts(UserAccountsId id, Realms realms, UserDetails userDetails, String status, boolean isEdge, boolean chargesAccess) { this.id = id; this.realms = realms; this.userDetails = userDetails; this.status = status; this.isEdge = isEdge; this.chargesAccess = chargesAccess; } @EmbeddedId @AttributeOverrides( { @AttributeOverride(name = "realmTypeId", column = @Column(name = "realm_type_id", nullable = false, length = 32)), @AttributeOverride(name = "realmQualifier", column = @Column(name = "realm_qualifier", nullable = false, length = 32)), @AttributeOverride(name = "accountId", column = @Column(name = "account_id", nullable = false)) }) @NotNull public UserAccountsId getId() { return this.id; } public void setId(UserAccountsId id) { this.id = id; } @OneToOne(fetch = FetchType.LAZY, mappedBy = "userAccounts") public BasicAuthAccounts getBasicAuthAccounts() { return this.basicAuthAccounts; } public void setBasicAuthAccounts(BasicAuthAccounts basicAuthAccounts) { this.basicAuthAccounts = basicAuthAccounts; } /** * UserAccountsId generated by hbm2java */ @Embeddable public class UserAccountsId implements java.io.Serializable { private String realmTypeId; private String realmQualifier; private long accountId; public UserAccountsId() { } public UserAccountsId(String realmTypeId, String realmQualifier, long accountId) { this.realmTypeId = realmTypeId; this.realmQualifier = realmQualifier; this.accountId = accountId; } @Column(name = "realm_type_id", nullable = false, length = 32) @NotNull @Length(max = 32) public String getRealmTypeId() { return this.realmTypeId; } public void setRealmTypeId(String realmTypeId) { this.realmTypeId = realmTypeId; } @Column(name = "realm_qualifier", nullable = false, length = 32) @NotNull @Length(max = 32) public String getRealmQualifier() { return this.realmQualifier; } public void setRealmQualifier(String realmQualifier) { this.realmQualifier = realmQualifier; } @Column(name = "account_id", nullable = false) public long getAccountId() { return this.accountId; } public void setAccountId(long accountId) { this.accountId = accountId; } Main Code for classes are:

  • Overflow exception while performing parallel factorization using the .NET Task Parallel Library (TPL

    - by Aviad P.
    Hello, I'm trying to write a not so smart factorization program and trying to do it in parallel using TPL. However, after about 15 minutes of running on a core 2 duo machine, I am getting an aggregate exception with an overflow exception inside it. All the entries in the stack trace are part of the .NET framework, the overflow does not come from my code. Any help would be appreciated in figuring out why this happens. Here's the commented code, hopefully it's simple enough to understand: class Program { static List<Tuple<BigInteger, int>> factors = new List<Tuple<BigInteger, int>>(); static void Main(string[] args) { BigInteger theNumber = BigInteger.Parse( "653872562986528347561038675107510176501827650178351386656875178" + "568165317809518359617865178659815012571026531984659218451608845" + "719856107834513527"); Stopwatch sw = new Stopwatch(); bool isComposite = false; sw.Start(); do { /* Print out the number we are currently working on. */ Console.WriteLine(theNumber); /* Find a factor, stop when at least one is found (using the Any operator). */ isComposite = Range(theNumber) .AsParallel() .Any(x => CheckAndStoreFactor(theNumber, x)); /* Of the factors found, take the one with the lowest base. */ var factor = factors.OrderBy(x => x.Item1).First(); Console.WriteLine(factor); /* Divide the number by the factor. */ theNumber = BigInteger.Divide( theNumber, BigInteger.Pow(factor.Item1, factor.Item2)); /* Clear the discovered factors cache, and keep looking. */ factors.Clear(); } while (isComposite); sw.Stop(); Console.WriteLine(isComposite + " " + sw.Elapsed); } static IEnumerable<BigInteger> Range(BigInteger squareOfTarget) { BigInteger two = BigInteger.Parse("2"); BigInteger element = BigInteger.Parse("3"); while (element * element < squareOfTarget) { yield return element; element = BigInteger.Add(element, two); } } static bool CheckAndStoreFactor(BigInteger candidate, BigInteger factor) { BigInteger remainder, dividend = candidate; int exponent = 0; do { dividend = BigInteger.DivRem(dividend, factor, out remainder); if (remainder.IsZero) { exponent++; } } while (remainder.IsZero); if (exponent > 0) { lock (factors) { factors.Add(Tuple.Create(factor, exponent)); } } return exponent > 0; } } Here's the exception thrown: Unhandled Exception: System.AggregateException: One or more errors occurred. --- > System.OverflowException: Arithmetic operation resulted in an overflow. 
at System.Linq.Parallel.PartitionedDataSource`1.ContiguousChunkLazyEnumerator.MoveNext(T& currentElement, Int32& currentKey) at System.Linq.Parallel.AnyAllSearchOperator`1.AnyAllSearchOperatorEnumerator`1.MoveNext(Boolean& currentElement, Int32& currentKey) at System.Linq.Parallel.StopAndGoSpoolingTask`2.SpoolingWork() at System.Linq.Parallel.SpoolingTaskBase.Work() at System.Linq.Parallel.QueryTask.BaseWork(Object unused) at System.Linq.Parallel.QueryTask.<.cctor>b__0(Object o) at System.Threading.Tasks.Task.InnerInvoke() at System.Threading.Tasks.Task.Execute() --- End of inner exception stack trace --- at System.Linq.Parallel.QueryTaskGroupState.QueryEnd(Boolean userInitiatedDispose) at System.Linq.Parallel.SpoolingTask.SpoolStopAndGo[TInputOutput,TIgnoreKey](QueryTaskGroupState groupState, PartitionedStream`2 partitions, SynchronousChannel`1[] channels, TaskScheduler taskScheduler) at System.Linq.Parallel.DefaultMergeHelper`2.System.Linq.Parallel.IMergeHelper<TInputOutput>.Execute() at System.Linq.Parallel.MergeExecutor`1.Execute[TKey](PartitionedStream`2 partitions, Boolean ignoreOutput, ParallelMergeOptions options, TaskScheduler taskScheduler, Boolean isOrdered, CancellationState cancellationState, Int32 queryId) at System.Linq.Parallel.PartitionedStreamMerger`1.Receive[TKey](PartitionedStream`2 partitionedStream) at System.Linq.Parallel.AnyAllSearchOperator`1.WrapPartitionedStream[TKey](PartitionedStream`2 inputStream, IPartitionedStreamRecipient`1 recipient, BooleanpreferStriping, QuerySettings settings) at System.Linq.Parallel.UnaryQueryOperator`2.UnaryQueryOperatorResults.ChildResultsRecipient.Receive[TKey](PartitionedStream`2 inputStream) at System.Linq.Parallel.ScanQueryOperator`1.ScanEnumerableQueryOperatorResults.GivePartitionedStream(IPartitionedStreamRecipient`1 recipient) at System.Linq.Parallel.UnaryQueryOperator`2.UnaryQueryOperatorResults.GivePartitionedStream(IPartitionedStreamRecipient`1 recipient) at System.Linq.Parallel.QueryOperator`1.GetOpenedEnumerator(Nullable`1 mergeOptions, Boolean suppressOrder, Boolean forEffect, QuerySettings querySettings) at System.Linq.Parallel.QueryOpeningEnumerator`1.OpenQuery() at System.Linq.Parallel.QueryOpeningEnumerator`1.MoveNext() at System.Linq.Parallel.AnyAllSearchOperator`1.Aggregate() at System.Linq.ParallelEnumerable.Any[TSource](ParallelQuery`1 source, Func`2 predicate) at PFact.Program.Main(String[] args) in d:\myprojects\PFact\PFact\Program.cs:line 34 Any help would be appreciated. Thanks!

  • How can I Include multiple tables in my LINQ to Entities eager loading using MVC4 and C#?

    - by EBENEZER CURVELLO
    I have 6 classes and I try to use linq to Entities to get the SiglaUF information of the last deeper table (in the view - MVC). The problem is I receive the following error: "The ObjectContext instance has been disposed and can no longer be used for operations that require a connection." The view is like that: > @model IEnumerable<DiskPizzaDelivery.Models.EnderecoCliente> > @foreach (var item in Model) { > @Html.DisplayFor(modelItem => item.CEP.Cidade.UF.SiglaUF) > } The query that i use: var cliente = context.Clientes .Include(e => e.Enderecos) .Include(e1 => e1.Enderecos.Select(cep => cep.CEP)) .SingleOrDefault(); The question is: How Can I improve this query to pre loading (eager loading) "Cidade" and "UF"? See below the classes: public partial class Cliente { [Key] [DatabaseGeneratedAttribute(DatabaseGeneratedOption.Identity)] public int IdCliente { get; set; } public string Email { get; set; } public string Senha { get; set; } public virtual ICollection<EnderecoCliente> Enderecos { get; set; } } public partial class EnderecoCliente { public int IdEndereco { get; set; } public int IdCliente { get; set; } public string CEPEndereco { get; set; } public string Numero { get; set; } public string Complemento { get; set; } public string PontoReferencia { get; set; } public virtual Cliente Cliente { get; set; } public virtual CEP CEP { get; set; } } public partial class CEP { public string CodCep { get; set; } public string Tipo_Logradouro { get; set; } public string Logradouro { get; set; } public string Bairro { get; set; } public int CodigoUF { get; set; } public int CodigoCidade { get; set; } public virtual Cidade Cidade { get; set; } } public partial class Cidade { public int CodigoCidade { get; set; } public string NomeCidade { get; set; } public int CodigoUF { get; set; } public virtual ICollection<CEP> CEPs { get; set; } public virtual UF UF { get; set; } public virtual ICollection<UF> UFs { get; set; } } public partial class UF { public int CodigoUF { get; set; } public string SiglaUF { get; set; } public string NomeUF { get; set; } public int CodigoCidadeCapital { get; set; } public virtual ICollection<Cidade> Cidades { get; set; } public virtual Cidade Cidade { get; set; } } var cliente = context.Clientes .Where(c => c.Email == email) .Where(c => c.Senha == senha) .Include(e => e.Enderecos) .Include(e1 => e1.Enderecos.Select(cep => cep.CEP)) .SingleOrDefault(); Thanks!
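    One way to extend the eager loading two levels deeper (a sketch based on the classes above; the lambda Include extension used here requires "using System.Data.Entity;") is to keep chaining Select inside Include so that Cidade and UF are loaded up front and the view never triggers lazy loading on a disposed context:

        var cliente = context.Clientes
            .Where(c => c.Email == email && c.Senha == senha)
            .Include(c => c.Enderecos)
            .Include(c => c.Enderecos.Select(e => e.CEP))
            .Include(c => c.Enderecos.Select(e => e.CEP.Cidade))
            .Include(c => c.Enderecos.Select(e => e.CEP.Cidade.UF))
            .SingleOrDefault();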

  • Why does calling IEnumerable<string>.Count() create an additional assembly dependency ?

    - by Gishu
    Assume this chain of dll references Tests.dll >> Automation.dll >> White.Core.dll with the following line of code in Tests.dll, where everything builds result.MissingPaths Now when I change this to result.MissingPaths.Count() I get the following build error for Tests.dll "White.UIItem is not defined in an assembly that is not referenced. You must add a reference to White.Core.dll." And I don't want to do that because it breaks my layering. Here is the type definition for result, which is in Automation.dll public class HasResult { public HasResult(IEnumerable<string> missingPaths ) { MissingPaths = missingPaths; } public IEnumerable<string> MissingPaths { get; set; } public bool AllExist { get { return !MissingPaths.Any(); } } } Down the call chain the input param to this ctor is created via (The TreeNode class is in White.Core.dll) assetPaths.Where(assetPath => !FindTreeNodeUsingCache(treeHandle, assetPath)); Why does this dependency leak when calling Count() on IEnumerable ? I then suspected that lazy evaluation was causing this (for some reason) - so I slotted in an ToArray() in the above line but didn't work. Update 2011 01 07: Curiouser and Curiouser! it won't build until I add a White.Core reference. So I add a reference and build it (in order to find the elusive dependency source). Open it up in Reflector and the only references listed are Automation, mscorlib, System.core and NUnit. So the compiler threw away the White reference as it was not needed. ILDASM also confirms that there is no White AssemblyRef entry. Any ideas on how to get to the bottom of this thing (primarily for 'now I wanna know why' reasons)? What are the chances that this is an VS2010/MSBuild bug? Update 2011 01 07 #2 As per Shimmy's suggestion, tried calling the method explcitly as an extension method Enumerable.Count(result.MissingPaths) and it stops cribbing (not sure why). However I moved some code around after that and now I'm getting the same issue at a different location using IEnumerable - this time reading and filtering lines out of a file on disk (totally unrelated to White). Seems like it's a 'symptom-fix'. var lines = File.ReadLines(aFilePath).ToArray(); once again, if I remove the ToArray() it compiles again - it seems that any method that causes the enumerable to be evaluated (ToArray, Count, ToList, etc.) causes this. Let me try and get a working tiny-app to demo this issue... Update 2011 01 07 #3 Phew! More information.. It turns out the problem is just in one source file - this file is LINQ-phobic. Any call to an Enumerable extension method has to be explicitly called out. The refactorings that I did caused a new method to be moved into this source file, which had some LINQ :) Still no clue as to why this class dislikes LINQ. using System; using System.Collections.Generic; using System.IO; using System.Linq; using G.S.OurAutomation.Constants; using G.S.OurAutomation.Framework; using NUnit.Framework; namespace G.S.AcceptanceTests { public abstract class ConfigureThingBase : OurTestFixture { .... 
private static IEnumerable<string> GetExpectedThingsFor(string param) { // even this won't compile - although it compiles fine in an adjoining source file in the same assembly //IEnumerable<string> s = new string[0]; //Console.WriteLine(s.Count()); // this is the line that is now causing a build failure // var expectedInfo = File.ReadLines(someCsvFilePath)) // .Where(line => !line.StartsWith("REM", StringComparison.InvariantCultureIgnoreCase)) // .Select(line => line.Replace("%PLACEHOLDER%", param)) // .ToArray(); // Unrolling the LINQ above removes the build error var expectedInfo = Enumerable.ToArray( Enumerable.Select( Enumerable.Where( File.ReadLines(someCsvFilePath)), line => !line.StartsWith("REM", StringComparison.InvariantCultureIgnoreCase)), line => line.Replace("%PLACEHOLDER%", param)));

  • GWT Deserialisation of Persistent Entities (JPA)

    - by slartidan
    Hi everyone, I am currently developing Java/GWT-application which is hosted on a weblogic application server. I am using EJB3.0 with EclipseLink as persistence layer. Sadly my GWT has problems to deserialize persistent entities. It might be helpful for you to know, that I have the EclipseLink-Library in my classpath (including javax.persistence.Entity) am not recieving the persistence objects from a database or persistence-manager - I am creating the objects with standard java code use Eclipse IDE for Java EE Developers for development and deploying and I am compiling my GWT code with the GWT-Plugin (GWT 2.1.0) - my source code is split up in several projects am pretty sure, that the problems occures on client side, since the HTTP response of my server is the same in my working and in my not working example tried to patch javax.persistence.Entity and tried to include several libraries which included javax.persistence.Entity but nothing was helping In my server provides a list of instances of class SerialClass; the interface looks like this: public interface GreetingService extends RemoteService { List<SerialClass> greetServer(); } My onModuleLoad()-Method gets those instances and creates a browser-popup with the information: public void onModuleLoad() { GreetingServiceAsync server = (GreetingServiceAsync) GWT.create(GreetingService.class); server.greetServer(new AsyncCallback<List<SerialClass>>() { public void onFailure(Throwable caught) { } public void onSuccess(List<SerialClass> result) { String resultString = ""; try { for (SerialClass serial : result) { if (serial == null) { resultString += "null "; } else { resultString += ">" + serial.id + "< "; } } } catch (Throwable t) { Window.alert("failed to process"); } Window.alert("success:" + resultString); } }); } My server is looking like this: public class GreetingServiceImpl extends RemoteServiceServlet implements GreetingService { public List<SerialClass> greetServer() throws IllegalArgumentException { List<SerialClass> list = new ArrayList<SerialClass>(); for (int i = 0; i < 100; i++) { list.add(new SerialClass()); } return list; } } Case 1 = everything works fine I am using this SerialClass (either without any annotation, or with any annotation other than Entity - for example javax.persistence.PersistenceContext works fine): //@Entity public class SerialClass implements Serializable, IsSerializable { public int id = 4711; } The popup contains (as expected): success:>4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< >4711< The data sent over HTTP looks like this: 
//OK[4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,100,1,["java.util.ArrayList/3821976829","serial.shared.SerialClass/10650133"],0,6] Case 2 = its not working at all I am using this SerialClass: @Entity public class SerialClass implements Serializable, IsSerializable { public int id = 4711; } My popup contains (THIS IS MY PROBLEM): success:>2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null >2< null The data sent over HTTP looks like this (exactly the same!): //OK[4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,4711,2,100,1,["java.util.ArrayList/3821976829","serial.shared.SerialClass/10650133"],0,6] There is no suspicious logging output - neither on server, nor on client. All HTTP-responses have return code 200. My current workaround I am going to try to create transfer objects as a copy of my SerialClass - those transfer objects will look exactly the same, but will not have the @Entity annotation. Alternatively I could try to use the RequestFactory (thanks to @Hilbrand for the hint). I really don't know how to solve that problem and I'm really thankful about any suggestions, hints, tips, links, etc.
