Search Results

Search found 74846 results on 2994 pages for 'business intelligence innovation adaptive data model insurance data warehouse'.


  • are there any useful datasets available on the web for data mining?

    - by niko
    Hi, does anyone know a good resource where example (real) data can be downloaded for experimenting with statistics and machine learning techniques such as decision trees? I am currently studying machine learning techniques, and it would be very helpful to have real data for evaluating the accuracy of various tools. If anyone knows a good resource (perhaps CSV or XLS files, or any other format) I would be very thankful for a suggestion.

    Read the article

  • Build model with nested model in rspec integration test

    - by user1116573
    I understand that I can do something like this in rspec: let(:project) { Project.new } but in my app a project accepts_nested_attributes_for tasks, and when I generate the Project form I build a task along with it using: @project = Project.new @project.tasks.build I need something like: let(:project) { Project.new.tasks.build } but that doesn't seem to work. How can I do this as a let in my rspec test?
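
    One possible reading of the problem (a sketch, not tested against this app): Project.new.tasks.build returns the built task, not the project, so the let would memoize the wrong object. Wrapping the construction in tap keeps the project as the block's return value:

        # tap yields the new project to the block and then returns the
        # project itself, so the task is built and attached but the let
        # still refers to the Project instance.
        let(:project) do
          Project.new.tap { |p| p.tasks.build }
        end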

    Read the article

  • OSI Model

    - by kaleidoscope
    The Open System Interconnection Reference Model (OSI Reference Model or OSI Model) is an abstract description for layered communications and computer network protocol design. In its most basic form, it divides network architecture into seven layers which, from top to bottom, are the Application, Presentation, Session, Transport, Network, Data Link, and Physical Layers. It is therefore often referred to as the OSI Seven Layer Model. A layer is a collection of conceptually similar functions that provide services to the layer above it and receive services from the layer below it.

    Description of the OSI layers:

    Layer 1: Physical Layer
    - Defines the electrical and physical specifications for devices; in particular, the relationship between a device and a physical medium.
    - Handles establishment and termination of a connection to a communications medium.
    - Participates in the process whereby the communication resources are effectively shared among multiple users.
    - Performs modulation: conversion between the representation of digital data in user equipment and the corresponding signals transmitted over a communications channel.

    Layer 2: Data Link Layer
    - Provides the functional and procedural means to transfer data between network entities.
    - Detects, and possibly corrects, errors that may occur in the Physical Layer. The error check is performed using the Frame Check Sequence (FCS).
    - The destination address is then examined to see whether the host needs to process the rest of the frame itself or pass it on to another host.
    - The layer is divided into two sublayers: the Media Access Control (MAC) layer and the Logical Link Control (LLC) layer.
    - The MAC sublayer controls how a computer on the network gains access to the data and permission to transmit it.
    - The LLC sublayer controls frame synchronization, flow control and error checking.

    Layer 3: Network Layer
    - Provides the functional and procedural means of transferring variable-length data sequences from a source to a destination via one or more networks.
    - Performs network routing functions, and might also perform fragmentation and reassembly, and report delivery errors.
    - Routers operate at this layer, sending data throughout the extended network and making the Internet possible.

    Layer 4: Transport Layer
    - Provides transparent transfer of data between end users, providing reliable data transfer services to the upper layers.
    - Controls the reliability of a given link through flow control, segmentation/de-segmentation, and error control.
    - Can keep track of the segments and retransmit those that fail.

    Layer 5: Session Layer
    - Controls the dialogues (connections) between computers.
    - Establishes, manages and terminates the connections between the local and remote application.
    - Provides for full-duplex, half-duplex, or simplex operation, and establishes checkpointing, adjournment, termination, and restart procedures.
    - Is implemented explicitly in application environments that use remote procedure calls.

    Layer 6: Presentation Layer
    - Establishes a context between Application Layer entities, in which the higher-layer entities can use different syntax and semantics, as long as the presentation service understands both and the mapping between them. The presentation service data units are then encapsulated into Session Protocol data units and moved down the stack.
    - Provides independence from differences in data representation (e.g., encryption) by translating from application to network format, and vice versa. The presentation layer works to transform data into the form that the application layer can accept. This layer formats and encrypts data to be sent across a network, providing freedom from compatibility problems. It is sometimes called the syntax layer.

    Layer 7: Application Layer
    - This layer interacts with software applications that implement a communicating component.
    - Identifies communication partners, determines resource availability, and synchronizes communication:
      - When identifying communication partners, the application layer determines the identity and availability of communication partners for an application with data to transmit.
      - When determining resource availability, the application layer must decide whether sufficient network resources for the requested communication exist.
      - In synchronizing communication, all communication between applications requires cooperation that is managed by the application layer.

    Read the article

  • Add note model in Rails

    - by dannymcc
    Hi everyone, I am following the 15-minute blog tutorial on RubyOnRails.com: http://media.rubyonrails.org/video/rails_blog_2.mov and am stumbling into some issues. I am using the following alterations to the names in the tutorial: posts = kases, comments = notes.

    I have set up the models as follows: class Kase < ActiveRecord::Base validates_presence_of :jobno has_many :notes belongs_to :company # foreign key: company_id belongs_to :person # foreign key in join table belongs_to :surveyor, :class_name => "Company", :foreign_key => "appointedsurveyor_id" belongs_to :surveyorperson, :class_name => "Person", :foreign_key => "surveyorperson_id" def to_param jobno end and... class Note < ActiveRecord::Base belongs_to :kase end

    The Notes controller looks like this: # POST /notes # POST /notes.xml def create @kase = Kase.find(params[:kase_id]) @note = @kase.notes.build(params[:note]) redirect_to @kase end

    The database schema for notes looks like this: create_table "notes", :force => true do |t| t.integer "kase_id" t.text "body" t.datetime "created_at" t.datetime "updated_at" end and for kases... create_table "kases", :force => true do |t| t.string "jobno" t.date "dateinstructed" t.string "clientref" t.string "clientcompanyname" t.text "clientcompanyaddress" t.string "clientcompanyfax" t.string "casehandlername" t.string "casehandlertel" t.string "casehandleremail" t.text "casesubject" t.string "transport" t.string "goods" t.string "claimantname" t.string "claimantaddressline1" t.string "claimantaddressline2" t.string "claimantaddressline3" t.string "claimantaddresscity" t.string "claimantaddresspostcode" t.string "claimantcontact" t.string "claimanttel" t.string "claimantmob" t.string "claimantemail" t.string "claimanturl" t.string "lyingatlocationname" t.string "lyingatlocationaddressline1" t.string "lyingatlocationaddressline2" t.string "lyingatlocationaddressline3" t.string "lyingatlocationaddresscity" t.string "lyingatlocationaddresspostcode" t.string "lyingatlocationcontactname" t.string "lyingattel" t.string "lyingatmobile" t.string "lyingatlocationurl" t.text "comments" t.string "invoicenumber" t.string "netamount" t.string "vat" t.string "grossamount" t.date "dateclosed" t.date "datepaid" t.datetime "filecreated" t.string "avatar_file_name" t.string "avatar_content_type" t.integer "avatar_file_size" t.datetime "avatar_updated_at" t.datetime "created_at" t.datetime "updated_at" t.string "kase_status" t.string "invoice_date" t.integer "surveyorperson_id" t.integer "appointedsurveyor_id" t.integer "person_id" t.string "company_id" t.string "dischargeamount" t.string "dishchargeheader" t.text "highrisesubject" end

    Whenever I enter a note into the kase show view's note entry form: <h2>Notes</h2> <div id="sub-notes"> <%= render :partial => @kase.notes %> </div> <% form_for [@kase, Note.new] do |f| %> <p> <%= f.label :body, "New Note" %><br /> <%= f.text_area :body %> </p> <p><%= f.submit "Add Note" %></p> <% end %> partial: <% div_for note do %> <p> <strong>Created <%= time_ago_in_words(note.created_at) %> ago</strong><br /> <%= h(note.body) %> </p> <% end %>

    I get the following error: ActiveRecord::RecordNotFound in NotesController#create Couldn't find Kase with ID=Test Case

    I have tried removing the def to_param jobno end from the kase model, but the same error shows. Any ideas what I'm missing? Thanks, Danny
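
    A hedged guess at the cause, with a sketch of one possible fix: to_param makes the jobno the kase's URL segment, so params[:kase_id] holds "Test Case" while Kase.find expects a numeric primary key (which may also be why removing to_param still showed the error if the form URL had been generated earlier). Looking the kase up by jobno keeps the custom URLs working, assuming the dynamic finder is available in your Rails version:

        # POST /notes -- a sketch, assuming the schema shown above.
        def create
          # params[:kase_id] carries the jobno (e.g. "Test Case"),
          # because Kase#to_param returns jobno instead of the id,
          # and Kase.find only looks up primary keys.
          @kase = Kase.find_by_jobno!(params[:kase_id])
          @note = @kase.notes.build(params[:note])
          @note.save
          redirect_to @kase
        end

    Note that a jobno such as "Test Case" contains a space, so it will be URL-escaped in paths; many apps return a parameterized value from to_param for that reason.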

    Read the article

  • Big Data: Size isn’t everything

    - by Simon Elliston Ball
    Big Data has a big problem; it's the word "Big". These days, a quick Google search will uncover terabytes of negative opinion about the futility of relying on huge volumes of data to produce magical, meaningful insight. There are also many clichéd but correct assertions about the difficulties of correlation versus causation in massive data sets. In reading some of these pieces, I begin to understand how climatologists must feel when people complain ironically about "global warming" during snowfall. Big Data has a name problem. There is a lot more to it than size. Shape, Speed, and…err…Veracity are also key elements (now I understand why Gartner and the gang went with V's instead of S's). The need to handle data of different shapes (Variety) is not new. Data developers have always had to mold strange-shaped data into our reporting systems, integrating with semi-structured sources, and even straying into full-text searching. However, what we lacked was an easy way to add semi-structured and unstructured data to our arsenal. New "Big Data" tools such as MongoDB and other NoSQL (Not Only SQL) databases, or a graph database like Neo4J, fill this gap. Still, to many, they simply introduce noise to the clean signal that is their sensibly normalized data structures. What about speed (Velocity)? It's not just high-frequency trading that generates data faster than a single system can handle. Many other applications need to make trade-offs that traditional databases won't, in order to cope with high data insert speeds, or to quickly extract the required information from data streams. Unfortunately, many people equate Big Data with the Hadoop platform, whose batch-driven queries and job processing queues have little to do with "velocity". StreamInsight, Esper and Tibco BusinessEvents are examples of Big Data tools designed to handle high-velocity data streams. Again, the name doesn't do the discipline of Big Data any favors. Ultimately, though, does analyzing fast-moving data produce insights as useful as the ones we get through a more considered approach, enabled by traditional BI? Finally, we have Veracity and Value. In many ways, these additions to the classic Volume, Velocity and Variety trio acknowledge the criticism that without high-quality data and genuinely valuable outputs, data, big or otherwise, is worthless. As a discipline, Big Data has recognized this, and data quality and cleaning tools are starting to appear to support it. Rather than simply decrying the irrelevance of Volume, we need as a profession to focus on how to improve Veracity and Value. Perhaps we should just declare the 'Big' silent, embrace these new data tools and help develop better practices for their use, just as we did for the good old RDBMS? What does Big Data mean to you? Which V gives your business the most pain, or the most value? Do you see these new tools as a useful addition to the BI toolbox, or are they just enabling a dangerous trend to find ghosts in the noise?

    Read the article

  • Chalk Talk with John: Business Value of Identity and Access Management

    - by John Brunswick
    Conveying the business value of Identity and Access Management to non-technologists can potentially be challenging, especially considering the breadth of capability supplied by these technologies. In this episode of Chalk Talk with John, Bob at Codeaway Valley asks Jim from Middleware Fields how they are able to manage access to buildings and facilities throughout their community. Bob and his team struggle to keep up with the needs of their community members, while ensuring the community's safety. Jim shares his creative solution to simplifying the management of access throughout their community in Middleware Fields.

    About me: Hi, I am John Brunswick, an Oracle Enterprise Architect. As an Oracle Enterprise Architect, I focus on the alignment of technical capabilities in support of business vision and objectives, as well as the overall business value of technology. Before coming to Oracle, I was a Practice Manager within BEA Systems' Business Interaction Division consulting organization, orchestrating enterprise systems in support of line-of-business goals. Follow me on Twitter and visit my site for Oracle Fusion Middleware related tips.

    Read the article

  • Know your Data Lineage

    - by Simon Elliston Ball
    An academic paper without the footnotes isn't an academic paper. Journalists wouldn't base a news article on facts that they can't verify. So why would anyone publish reports without being able to say where the data has come from and be confident of its quality, in other words, without knowing its lineage (sometimes referred to as 'provenance' or 'pedigree')? The number and variety of data sources, both traditional and new, increases inexorably. Data comes clean or dirty, processed or raw, unimpeachable or entirely fabricated. On its journey from its source to our report, the data can travel through a network of interconnected pipes, passing through numerous distinct systems, each managed by different people. At each point along the pipeline, it can be changed, filtered, aggregated and combined. When the data finally emerges, how can we be sure that it is right? How can we be certain that no part of the data collection was based on incorrect assumptions, that key data points haven't been left out, or that the sources are good? Even when we're using data science to give us an approximate or probable answer, we cannot have any confidence in the results without confidence in the data from which they came. You need to know what has been done to your data, where it came from, and who is responsible for each stage of the analysis. This information represents your data lineage; it is your stack-trace. If you're an analyst, suspicious of a number, it tells you why the number is there and how it got there. If you're a developer, working on a pipeline, it provides the context you need to track down the bug. If you're a manager, or an auditor, it lets you know the right things are being done. Lineage tracking is part of good data governance. Most audit and lineage systems require you to buy into their whole structure. If you are using Hadoop for your data storage and processing, then tools like Falcon allow you to track lineage, as long as you are using Falcon to write and run the pipeline. It can mean learning a new way of running your jobs (or using some sort of proxy), and even a distinct way of writing your queries. Other Hadoop tools provide a lot of operational and audit information, spread throughout the many logs produced by Hive, Sqoop, MapReduce and all the various moving parts that make up the ecosystem. To get a full picture of what's going on in your Hadoop system you need to capture both Falcon lineage and the data-exhaust of the other tools that Falcon can't orchestrate. However, the problem is bigger even than that. Often, Hadoop is just one piece in a larger processing workflow. The next step of the challenge is how you bind together the lineage metadata describing what happened before and after Hadoop, where 'after' could be a data analysis environment like R, an application, or even directly into an end-user tool such as Tableau or Excel. One possibility is to push as much as you can of your key analytics into Hadoop, but would you give up the power and familiarity of your existing tools in return for a reliable way of tracking lineage? Lineage and auditing should work consistently, automatically and quietly, allowing users to access their data with any tool they require.
    The real solution, therefore, is to create a consistent method by which to bring lineage data from these various disparate sources into the data analysis platform that you use, rather than being forced to use the tool that manages the pipeline for the lineage and a different tool for the data analysis. The key is to keep your logs and your audit data, from every source, bring them together, and use the data analysis tools to trace the paths from raw data to the answer that data analysis provides.
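
    As a toy illustration of the "stack-trace for your data" idea (names entirely invented, not any particular lineage tool): if every transformation step appends a record of what it did, the final answer arrives with its provenance attached:

        # A value that carries its own lineage, appended at every step.
        LineageStep = Struct.new(:stage, :operation, :at)

        class TrackedData
          attr_reader :value, :lineage

          def initialize(value, lineage = [])
            @value   = value
            @lineage = lineage
          end

          # Apply a transformation and record who did what, and when.
          def transform(stage, operation)
            step = LineageStep.new(stage, operation, Time.now)
            TrackedData.new(yield(@value), @lineage + [step])
          end
        end

        raw    = TrackedData.new([3, 5, nil, 7])
        clean  = raw.transform("ingest", "drop nils") { |v| v.compact }
        answer = clean.transform("report", "sum")     { |v| v.inject(0, :+) }

        answer.lineage.each { |s| puts "#{s.stage}: #{s.operation} at #{s.at}" }
        puts "answer: #{answer.value}"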

    Read the article

  • Rails User-Profile model challenges

    - by Craig
    I am attempting to create an enrollment process similar to SO's:
    - route to an OpenID provider
    - provider returns the user's information to the UsersController (a guess)
    - UsersController creates user, then routes to the ProfilesController's new or edit action.

    For now, I'm simply trying to create the user, then route to the ProfilesController's new or edit action (not sure which I should be using). Here's what I have thus far:

    Models: class User < ActiveRecord::Base has_one :profile end class Profile < ActiveRecord::Base belongs_to :user end

    Routes: map.resources :users do |user| user.resource :profile end

    new_user_profile GET /users/:user_id/profile/new(.:format) {:controller=>"profiles", :action=>"new"} edit_user_profile GET /users/:user_id/profile/edit(.:format) {:controller=>"profiles", :action=>"edit"} user_profile GET /users/:user_id/profile(.:format) {:controller=>"profiles", :action=>"show"} PUT /users/:user_id/profile(.:format) {:controller=>"profiles", :action=>"update"} DELETE /users/:user_id/profile(.:format) {:controller=>"profiles", :action=>"destroy"} POST /users/:user_id/profile(.:format) {:controller=>"profiles", :action=>"create"} users GET /users(.:format) {:controller=>"users", :action=>"index"} POST /users(.:format) {:controller=>"users", :action=>"create"} new_user GET /users/new(.:format) {:controller=>"users", :action=>"new"} edit_user GET /users/:id/edit(.:format) {:controller=>"users", :action=>"edit"} user GET /users/:id(.:format) {:controller=>"users", :action=>"show"} PUT /users/:id(.:format) {:controller=>"users", :action=>"update"} DELETE /users/:id(.:format) {:controller=>"users", :action=>"destroy"}

    Controllers: class UsersController < ApplicationController # generate new-user form def new @user = User.new end # process new-user-form post def create @user = User.new(params[:user]) if @user.save redirect_to new_user_profile_path(@user) ... end end # generate edit-user form def edit @user = User.find(params[:id]) end # process edit-user-form post def update @user = User.find(params[:id]) respond_to do |format| if @user.update_attributes(params[:user]) flash[:notice] = 'User was successfully updated.' format.html { redirect_to(users_path) } format.xml { head :ok } ... end end end

    class ProfilesController < ApplicationController before_filter :get_user def get_user @user = User.find(params[:user_id]) end # generate new-profile form def new @user.profile = Profile.new @profile = @user.profile end # process new-profile-form post def create @user.profile = Profile.new(params[:profile]) @profile = @user.profile respond_to do |format| if @profile.save flash[:notice] = 'Profile was successfully created.' format.html { redirect_to(@profile) } format.xml { render :xml => @profile, :status => :created, :location => @profile } ... end end end # generate edit-profile form def edit @profile = @user.profile end # process edit-profile-form post def update @profile = @user.profile respond_to do |format| if @profile.update_attributes(params[:profile]) flash[:notice] = 'Profile was successfully updated.' # format.html { redirect_to(@profile) } format.html { redirect_to(user_profile(@user)) } format.xml { head :ok } else format.html { render :action => "edit" } format.xml { render :xml => @profile.errors, :status => :unprocessable_entity } end end end

    Edit-User View: ... <% form_for(@user) do |f| %> ...

    New-Profile View: ... <% form_for([@user,@profile]) do |f| %> ..
    I'm having two problems: (1) when saving an edit to the User model, the UsersController attempts to route to http://localhost:3000/users/1/profile.%23%3Cprofile:0x10438e3e8%3E instead of http://localhost:3000/users/1/profile; (2) when the new-profile form is being rendered, it throws an error that reads: undefined method `user_profiles_path' for # Is it better to create a blank profile when the user is created (in the UsersController), then edit it, OR to follow the RESTful convention of creating the profile in the ProfilesController (as I have done)? What am I missing? I did review Associating Two Models in Rails (user and profile), but it didn't address my needs. Thanks for your time.
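
    A sketch of two adjustments that may address both symptoms, given the singular resource route above (untested guesses, not verified against this app). First, in ProfilesController#update, use the _path helper with only the user; the singular helper takes one argument, and passing the profile as well can end up rendered into the URL, which would explain the /users/1/profile.%23%3Cprofile...%3E redirect:

        format.html { redirect_to(user_profile_path(@user)) }

    Second, in the new-profile view, give form_for an explicit :url; with a [parent, child] array it guesses the pluralized user_profiles_path, which does not exist for a singleton resource:

        <% form_for [@user, @profile], :url => user_profile_path(@user) do |f| %>
          ...
        <% end %>

    On the blank-profile question: either approach can work; creating the profile in the ProfilesController, as you are doing, is arguably the more conventional RESTful split.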

    Read the article

  • ai: Determining what tests to run to get most useful data

    - by Sai Emrys
    This is for http://cssfingerprint.com I have a system (see about page on site for details) where:
    - I need to output a ranked list, with confidences, of categories that match a particular feature vector
    - the binary feature vectors are a list of site IDs & whether this session detected a hit
    - feature vectors are, for a given categorization, somewhat noisy (sites will decay out of history, and people will visit sites they don't normally visit)
    - categories are a large, non-closed set (user IDs)
    - my total feature space is approximately 50 million items (URLs)
    - for any given test, I can only query approx. 0.2% of that space
    - I can only make the decision of what to query, based on results so far, ~10-30 times, and must do so in <~100ms (though I can take much longer to do post-processing, relevant aggregation, etc)
    - getting the AI's probability ranking of categories based on results so far is mildly expensive; ideally the decision will depend mostly on a few cheap sql queries
    - I have training data that can say authoritatively that any two feature vectors are the same category, but not that they are different (people sometimes forget their codes and use new ones, thereby making a new user id)

    I need an algorithm to determine what features (sites) are most likely to have a high ROI to query (i.e. to better discriminate between plausible-so-far categories [users], and to increase certainty that it's any given one). This needs to balance exploitation (test based on prior test data) and exploration (test stuff that's not been tested enough to find out how it performs). There's another question that deals with a priori ranking; this one is specifically about a posteriori ranking based on results gathered so far.

    Right now, I have little enough data that I can just always test everything that anyone else has ever gotten a hit for, but eventually that won't be the case, at which point this problem will need to be solved. I imagine that this is a fairly standard problem in AI - having a cheap heuristic for what expensive queries to make - but it wasn't covered in my AI class, so I don't actually know whether there's a standard answer. So, relevant reading that's not too math-heavy would be helpful, as well as suggestions for particular algorithms. What's a good way to approach this problem?
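
    For relevant reading that is not too math-heavy: this is essentially the multi-armed bandit problem, and UCB1, epsilon-greedy, and Thompson sampling are the standard exploitation/exploration algorithms for it. A minimal sketch of UCB1 adapted to this setting (hypothetical Site struct; "hits" stands in for whatever ROI measure discriminates between candidate users), cheap enough to drive from a few SQL aggregates:

        Site = Struct.new(:id, :hits, :trials)

        # UCB1 score: observed hit rate plus an exploration bonus that
        # shrinks as a site accumulates trials, so under-tested sites
        # get tried before being ruled out.
        def ucb1_score(site, total_trials)
          return Float::INFINITY if site.trials.zero?  # always test untried sites once
          mean  = site.hits.to_f / site.trials
          bonus = Math.sqrt(2 * Math.log(total_trials) / site.trials)
          mean + bonus
        end

        # Pick the `budget` highest-scoring sites for this round.
        def pick_sites(sites, budget)
          total = sites.map(&:trials).inject(0, :+)
          sites.max_by(budget) { |s| ucb1_score(s, total) }
        end

        sites = [Site.new(1, 9, 10), Site.new(2, 1, 10), Site.new(3, 0, 0)]
        p pick_sites(sites, 2).map(&:id)  # => [3, 1]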

    Read the article

  • Using game of life or other virtual environment for artificial (intelligence) life simulation? [closed]

    - by Berlin Brown
    One of my interests in AI focuses not so much on data but more on biologic computing. This includes neural networks, mapping the brain, cellular automata, virtual life and environments. Described below is an exciting project that involves developing a virtual environment for bots to evolve in. "Polyworld is a cross-platform (Linux, Mac OS X) program written by Larry Yaeger to evolve Artificial Intelligence through natural selection and evolutionary algorithms." http://en.wikipedia.org/wiki/Polyworld Polyworld is a promising project for studying virtual life, but it is still far from creating an "intelligent autonomous" agent. Here is my question: in theory, what parameters would you use to create an AI environment? Possibly a brain environment? Possibly multiple self-contained life organisms that have their own "brain" or life structures. I would like to create a spin on the game of life simulation. What if you have a 64x64 game of life grid? But instead of one grid, you might have N number of grids. The N number of grids are your "life force". If all of the game of life entities die in a particular grid, then that entire grid dies. A group of grids makes up a life form. I don't have an immediate goal. First, I want to simulate an environment and visualize what is going on in the environment with OpenGL and see if there are any interesting properties to the environment. I then want to add "scarce resources" and see if the AI environment can manage resources adequately.
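
    As a starting point for the simulation core, here is a minimal Ruby sketch of the variant described (before any OpenGL visualization or scarce resources): N independent Conway grids, where a grid whose population hits zero is removed from the organism:

        # One synchronous Conway step on a square grid of booleans,
        # with toroidal (wrap-around) edges.
        def life_step(grid)
          n = grid.size
          Array.new(n) do |y|
            Array.new(n) do |x|
              live = [-1, 0, 1].product([-1, 0, 1]).count do |dy, dx|
                next false if dy.zero? && dx.zero?
                grid[(y + dy) % n][(x + dx) % n]
              end
              live == 3 || (grid[y][x] && live == 2)
            end
          end
        end

        # A "life force" of N independent grids: a grid whose cells have
        # all died is removed, and the organism dies when no grids remain.
        def step_organism(grids)
          grids.map { |g| life_step(g) }.reject { |g| g.flatten.none? }
        end

        organism = Array.new(4) { Array.new(64) { Array.new(64) { rand < 0.3 } } }
        200.times do
          organism = step_organism(organism)
          break if organism.empty?
        end
        puts "surviving grids: #{organism.size}"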

    Read the article

  • NSURLConnection receives data even if no data was thrown back

    - by Anna Fortuna
    Let me explain my situation. Currently, I am experimenting with long-polling using NSURLConnection. I found this and decided to try it. What I do is send a request to the server with a timeout interval of 300 seconds (5 minutes). Here is a code snippet: NSURL *url = [NSURL URLWithString:urlString]; NSURLRequest *request = [NSURLRequest requestWithURL:url cachePolicy:NSURLCacheStorageAllowedInMemoryOnly timeoutInterval:300]; NSData *data = [NSURLConnection sendSynchronousRequest:request returningResponse:&resp error:&err]; Now I want to test whether the connection will "hold" the request if no data is thrown back from the server, so what I did was this: if (data != nil) [self performSelectorOnMainThread:@selector(dataReceived:) withObject:data waitUntilDone:YES]; And the function dataReceived: looks like this: - (void)dataReceived:(NSData *)data { NSLog(@"DATA RECEIVED!"); NSString *string = [NSString stringWithUTF8String:[data bytes]]; NSLog(@"THE DATA: %@", string); } Server-side, I created a function that returns data once it fits the arguments, and nothing if it doesn't fit. Here is a snippet of the PHP function: function retrieveMessages($vardata) { if (!empty($vardata)) { $result = check_data($vardata) //check_data is the function which returns 1 if $vardata //fits the arguments, and 0 if it fails to fit if ($result == 1) { $jsonArray = array('Data' => $vardata); echo json_encode($jsonArray); } } } As you can see, the function will only return data if $result is equal to 1. However, even if the function returns nothing, NSURLConnection will still perform the function dataReceived:, meaning the NSURLConnection still receives data, albeit empty. So can anyone help me here? How will I perform long-polling using NSURLConnection? Basically, I want to maintain the connection as long as no data is returned. So how will I do it? NOTE: I am new to PHP, so if my code is wrong, please point it out so I can correct it.

    Read the article

  • How to clone a model's attributes easily?

    - by Zabba
    I have these models: class Address < ActiveRecord::Base belongs_to :event attr_accessible :street, :city validates :street, :city, :presence => true end class Event < ActiveRecord::Base has_one :address accepts_nested_attributes_for :address end If I do the below assignment in the Event's create action and save the event, I get an error: #Use the current user's address for the event @event.address_attributes = current_user.address.attributes #Error occurs at the above mentioned line ActiveRecord::RecordNotFound (Couldn't find Address with ID=1 for Event with ID=) I think what's happening is that all the address's attributes (including the primary key) are getting assigned in the @event.address_attributes = line. But all I really want is the "real data" (street, city), not the primary key or created_at etc., to get copied over. I suppose I could write a small method to do this sort of selective copy, but I can't help feeling there must be some built-in method for this. What's the best/right way to do this?
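
    There is essentially a built-in way to do the selective copy, assuming a Rails version with ActiveSupport's Hash#slice / Hash#except (present in Rails 3, which the validates syntax above suggests). The error comes from the "id" key in the attributes hash, so drop the bookkeeping columns before assigning:

        # Copy only the "real data" keys (attribute keys are strings):
        @event.address_attributes =
          current_user.address.attributes.slice("street", "city")

        # Or copy everything except the bookkeeping columns:
        @event.address_attributes =
          current_user.address.attributes.except("id", "event_id", "created_at", "updated_at")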

    Read the article

  • How can I scrape specific data from a website

    - by Stoney
    I'm trying to scrape data from a website for research. The URLs are nicely organized in an example.com/x format, with x as an ascending number, and all of the pages are structured in the same way. I just need to grab certain headings and a few numbers which are always in the same locations. I'll then need to get this data into structured form for analysis in Excel. I have used wget before to download pages, but I can't figure out how to grab specific lines of text. Excel has a feature to grab data from the web (Data > From Web), but from what I can see it only allows me to download tables. Unfortunately, the data I need is not in tables.
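
    One common route, sketched here in Ruby with the nokogiri gem (the CSS selectors are placeholders; you would inspect one of the real pages to find the right ones): loop over example.com/x, pull the heading and numbers out of each page, and write a CSV that Excel opens directly:

        require "open-uri"   # URI.open (Ruby 2.5+)
        require "nokogiri"
        require "csv"

        CSV.open("results.csv", "w") do |csv|
          csv << ["page", "heading", "value"]           # header row for Excel
          (1..100).each do |x|
            doc     = Nokogiri::HTML(URI.open("http://example.com/#{x}"))
            heading = doc.at_css("h1")&.text            # placeholder selector
            value   = doc.at_css(".the-number")&.text   # placeholder selector
            csv << [x, heading, value]
          end
        end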

    Read the article

  • How far to go with Domain Driven Design?

    - by synti
    I've read a little about domain-driven design and the usage of a rich domain model, as described by Martin Fowler, and I've decided to put it in practice in a personal project, instead of using transaction scripts. Everything went fine until UI implementation started. The thing is, some views will use rich components that are backed by unusual models and, thus, I must transform the domain model into what is used by those components. And that transformation is especially "complex" in the view-to-domain direction, up to the point that some business logic is involved. Which brings me to the question: where should I do these adaptations? So far I've got the following conclusions: Doing it in the presentation layer is good because, well, if that layer imposes restrictions on its model, then it should be the one to handle them. But it's bad because there'll be some business leakage. If I do it in the service objects (controllers, actions, whatever), then it'd be good because there won't be any change to the domain API just because of the presentation layer, but it's bad because then I'd have transaction scripts, which is not the intended design. Finally, if I do it in the domain model, there'd be no leakage of business logic at all. But in the future I could expect an explosion of the API into a series of methods designed just to handle that view-model <- domain-model adaptation. I hope I've made myself clear on this.
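
    One way to see the middle option without sliding into transaction scripts: keep the transformation in a dedicated adapter (assembler) object that translates in both directions but delegates every business decision to domain methods. A rough sketch with invented names:

        # Translates between a rich UI component's model and the domain.
        # The mapping logic lives here; the business rules stay in the
        # domain object, reached only through its public methods.
        class ScheduleViewAdapter
          def initialize(project)   # `project` is a domain object
            @project = project
          end

          # Domain -> view model consumed by the component.
          def to_view_rows
            @project.tasks.map { |t| { label: t.name, span: t.duration_in_days } }
          end

          # View model -> domain, via domain methods, so invariants and
          # business logic are enforced inside the model, not here.
          def apply_view_rows(rows)
            rows.each { |r| @project.reschedule_task(r[:label], days: r[:span]) }
          end
        end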

    Read the article

  • NEW! Oracle Unified Business Process Management Specialization!

    - by michaela.seika(at)oracle.com
    Be one of the very first to become an Oracle Unified Business Process Management Specialist! Check out the Oracle Unified Business Process Management Knowledge Zone and go to the Specialization criteria to learn how you can become a BPM Specialized Partner. Pass the following assessment tests and exam to meet the competency criteria:
    - Oracle Unified Business Process Management Suite 11g Sales Specialist
    - Oracle Unified Business Process Management Suite 11g PreSales Specialist Assessment
    - Oracle Unified Business Process Management Suite 11g Essentials (1Z1-560) Exam

    Go to the OPN Competency Center to access the Specialist Guided Learning Paths and Boot Camp to get the product information that can help you pass the tests:
    - Oracle Unified Business Process Management 11g Sales Specialist GLP
    - Oracle Unified Business Process Management 11g PreSales Specialist GLP
    - Oracle Unified Business Process Management 11g Implementation Specialist GLP
    - Oracle Unified Business Process Management Suite 11g Implementation Boot Camp

    The Oracle Unified Business Process Management Suite 11g Essentials (1Z1-560) Exam is available in beta testing. Pass the exam to become an Oracle Unified Business Process Management Certified Implementation Specialist! As an incentive we are offering FREE beta exam vouchers to early-adopter Partners. As there are a limited number of free vouchers available, requests will be processed on a first-come, first-served basis. To request a voucher, send an e-mail to [email protected], specifying the exam name and your contact information: name, job role, and company name. Register for the OU beta exam at the nearest Pearson VUE testing center.

    For More Information:
    - Oracle Certification Program Beta Exams
    - OPN Certified Specialist exams
    - OPN Certified Specialist FAQ

    Read the article

  • How important is Domain knowledge vs. Technical knowledge?

    - by Mayank
    I am working on a Trading and Risk Management application and, although I come from a C# background, I have been asked to work on SSIS packages. Now I can live with that. The pain point is that there is too much emphasis on business understanding. Trading (Energy Trading, to be exact) is a HUGE area, and understanding every little bit of it is overwhelming. But for the past two months I have been working on understanding the business terms - Mark to Market, Risk Metrics, Positions, PnL, Greeks, Instruments, Book Structure... every little detail (you get the point). Now IMHO, this is the job of a BA. Sure, it is very important for developers to understand the business, but where do you draw the line? When I talked to my manager about this, he almost mocked me by saying that anybody can learn a technology in a week; it's the business that's harder. My long-term aspiration is to remain on the technical side, probably become an architect (if possible). If I wanted to focus so much on business I would have pursued an MBA! I want to know whether I am wrong or too naive about the importance of business knowledge, or whether my frustration is justified.

    Read the article

  • When to implement: Together with or after the source product?

    - by Jeremy Oosthuizen
    Somebody recently relayed a prospect's question to me: how hard would it be to implement OUBI after the source product (CC&B, WAM or NMS) has already been implemented? Fact is that MOST non-OUBI Data Warehouse / Business Intelligence implementations take place after the source application(s) are in place and hopefully stable. If an organization decides that they need better reporting and management information, then the logical path (see The Data Warehouse Institute's Data Warehouse Maturity Model) is to a Data Warehouse -- no matter when their last applications were implemented. If there is a pre-built Data Warehouse for their specific application, or even for the desired business process in their industry, they're in luck. Else they have to design and build from scratch, using a toolset. The implementation of a toolset is unlike the implementation of OUBI which, like OBI Apps, contains pre-built ETL routines and user content. Much has been written before about the advantages of that. So, because OUBI is designed specifically for Oracle Utilities transactional products, we often implement them in parallel -- with OUBI lagging a little behind by necessity, like Reporting. Customers know from the start they're going to need the solution, and therefore purchase the products at the same time.

    My biggest argument FOR a parallel installation/implementation of OUBI with the source product is two-fold:
    - There could be things (which is the technical term for data elements) that customers figure out they need when implementing OUBI, which are often easier to add during the source product's implementation project than afterwards;
    - OUBI's ETL often points out errors (severe or not) with converted data, which are easier to fix during the source product's implementation project, and may even be impossible to fix afterwards. The Conversion routines sometimes miss these errors, because the source system can live with the not-quite-perfect converted data. If the data can't be properly extracted, i.e. the proper Dimensions linked to the Facts, then it can't get into OUBI. That means it can't be analyzed effectively along with the rest of the organization's data.

    Then there is also the throw-away-work argument, which may be significant. The operational / transactional system cannot go live without reports on Day 1. A lot of those reports would be taken care of by the implementation of OUBI. If OUBI is implemented after go-live, those reports STILL have to be built during the source product's implementation project, but they become throw-away after the OUBI implementation. I have sometimes been told that it is better to implement OUBI after the source product, because it cuts down on scope and risk for the source product's implementation project. All I can say to that is: bah humbug. No, seriously, given the arguments above, planning has to include the OUBI implementation and it has to be managed properly -- just like any other implementation. If so, it should not add any risk and it should be included in the scope from the start. The answer to the prospect's question is therefore that it is not that much more difficult; after all, most DW/BI implementations are done like that. They just have to consider the points above.

    Read the article

  • CoreData is saving model without the save: method being called

    - by chris
    I have a list of items with a plus button in the navigation bar that opens up a modal window containing a table of the model's attributes, which displays a form when the table items are clicked (pretty standard form style). For some reason, if I click the plus button to open the form to create a new model, then immediately click the done button, the person model is saved. The action linked to the done button does nothing but call a delegate method notifying the personListViewController to close the window. The Apple docs do state that the model is not saved after calling the insertNewObjectForEntityName: ... Simply creating a managed object does not cause it to be saved to a persistent store. The managed object context acts as a scratchpad. I am at a loss as to why this is happening, but every time I click the done button I have a new blank item in the original tableView. I am using SDK v3.1.3 // PersonListViewController - open modal window - (void)addPerson { // Load the new form PersonNewViewController *newController = [[PersonNewViewController alloc] init]; newController.modalFormDelegate = self; [self presentModalViewController:newController animated:YES]; [newController release]; } // PersonFormViewController - (void)viewDidLoad { [super viewDidLoad]; if ( person == nil ) { MyAppDelegate *appDelegate = [[UIApplication sharedApplication] delegate]; self.person = [NSEntityDescription insertNewObjectForEntityForName:@"Person" inManagedObjectContext:appDelegate.managedObjectContext]; } ... UIBarButtonItem *doneButton = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemDone target:self action:@selector(done)]; self.navigationItem.rightBarButtonItem = doneButton; [doneButton release]; } - (IBAction) done { [self.modalFormDelegate didCancel]; } // delegate method in the original listViewController - (void) didCancel { [self dismissModalViewControllerAnimated:YES]; }

    Read the article

  • Magento, NGINX, PHP-FPM, APC, MEMCACHED, 16gb Ram CentOS, Spiking PHP-FPM to 100% CPU

    - by Terry Dunford
    I have been trying to resolve my issue of spiking cpu caused by php-fpm processes. I've reduced the php-fpm config settings to: pm = ondemand pm.max_children = 12 pm.start_servers = 2 pm.min_spare_servers = 2 pm.max_spare_servers = 10 pm.max_requests = 500 php_admin_value[memory_limit] = 128M Problem still exists. I'm running a Joomla main site (which is having no problems) and a Magento store in a sub-directory. My server is a Linux CentOS, running NGINX, APC, Memcached, Full Page Cache and php-fpm. My server has 8 cores and 16gb dedicated ram. My host has shut down my server several times the past week because my php-fpm processes are consuming the entire network. A lot of the individual php-fpm processes are getting over 50% cpu. I've hired several "professionals" and none of them was able to help me, so now broke and stumped, I'm turning to you guys for help. So any suggestions would be greatly appreciated. I turned on slow php logs and here are some of the latest results: [01-Apr-2012 14:26:12] [pool magento] pid 21537 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a394f8] _renderStraightjoin() /home/flyfish/www/flyshop/lib/Varien/Db/Select.php:397 [0x0000000011a39158] _renderStraightjoin() /home/flyfish/www/flyshop/lib/Zend/Db/Select.php:705 [0x0000000011a38f30] assemble() /home/flyfish/www/flyshop/lib/Zend/Db/Select.php:1343 [0x00007fffbb6d6e50] __toString() unknown:0 [0x0000000011a38630] _prepareQuery() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:409 [0x0000000011a38270] _prepareQuery() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:388 [0x0000000011a38008] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734 [0x0000000011a375c8] fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:196 [0x0000000011a370e0] _loadLabels() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:129 [0x0000000011a369a0] _afterLoad() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536 [0x0000000011a364a8] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253 [0x0000000011a35968] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330 [0x0000000011a35590] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458 [0x0000000011a35410] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264 [0x0000000011a35098] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244 [0x0000000011a34fa8] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308 [0x0000000011a33998] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122 [0x0000000011a331f0] +++ dump failed [01-Apr-2012 14:26:44] [pool magento] pid 21531 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a37768] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:251 [0x0000000011a37280] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:132 [0x0000000011a36b40] _afterLoad() 
/home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536 [0x0000000011a36648] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253 [0x0000000011a35b08] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330 [0x0000000011a35730] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458 [0x0000000011a355b0] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264 [0x0000000011a35238] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244 [0x0000000011a35148] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308 [0x0000000011a33b38] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122 [0x0000000011a33390] +++ dump failed [01-Apr-2012 14:27:01] [pool magento] pid 21528 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011ff67a8] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000011ff6518] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000011ff5e90] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000011ff5a20] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000011ff5438] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000011ff5078] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000011ff4e98] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:825 [0x0000000011ff4948] fetchOne() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:1161 [0x0000000011ff4678] getProductCount() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Category.php:801 [0x0000000011ff33e0] getProductCount() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:54 [0x0000000011ff2da0] _initItemsData() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:23 [0x0000000011ff2818] _getItemsData() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:119 [0x0000000011ff26b0] _initItems() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Layer/Filter/Abstract.php:120 [0x0000000011ff2598] getItems() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Layer/Filter/Abstract.php:109 [0x0000000011ff2480] getItemsCount() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Block/Layer/Filter/Abstract.php:126 [0x0000000011ff22b8] getItemsCount() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Layer/View/67dcc5dfa9c44bd3a205b75a08193105.php:218 [0x0000000011ff2088] canShowOptions() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Layer/View/67dcc5dfa9c44bd3a205b75a08193105.php:233 [0x0000000011ff14f8] canShowBlock() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/extendware/ewlayerednav/catalog/layer/view.phtml:6 [0x0000000011ff0d50] +++ dump failed [01-Apr-2012 14:27:04] [pool magento] pid 21529 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000012468ff8] execute() 
/home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000012468d68] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x00000000124686e0] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000012468270] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000012467c88] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x00000000124678c8] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000012467660] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734 [0x0000000012467248] fetchAll() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:687 [0x00000000124668f0] _fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Collection/Abstract.php:1045 [0x0000000012466288] _loadEntities() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Collection/Abstract.php:869 [0x0000000012465fb0] load() /home/flyfish/www/flyshop/app/code/core/Mage/Review/Model/Observer.php:78 [0x0000000012465d10] catalogBlockProductCollectionBeforeToHtml() /home/flyfish/www/flyshop/app/code/core/Mage/Core/Model/App.php:1303 [0x0000000012464c28] _callObserverMethod() /home/flyfish/www/flyshop/app/code/core/Mage/Core/Model/App.php:1278 [0x00000000124649e0] dispatchEvent() /home/flyfish/www/flyshop/app/Mage.php:416 [0x0000000012464290] dispatchEvent() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Block/Product/List.php:163 [0x0000000012463760] _beforeToHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:864 [0x00000000124633b0] toHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:584 [0x0000000012462e30] _getChildHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:528 [0x0000000012462d38] getChildHtml() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Category/View/6362e7526f5dcb27e7f8b0b414b59004.php:85 [0x00000000124629f0] getProductListHtml() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Block/Override/Mage/Catalog/Category/View.php:20 [01-Apr-2012 14:27:55] [pool magento] pid 21536 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a35010] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000011a34d80] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000011a346f8] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000011a34288] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000011a33ca0] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000011a338e0] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000011a33700] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:825 [0x0000000011a33368] fetchOne() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Type.php:71 [0x0000000011a33238] getAdditionalAttributeTable() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:483 [0x0000000011a32be8] getAdditionalAttributeTable() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:500 [0x0000000011a32860] _afterLoad() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:108 [0x0000000011a32330] loadByCode() 
/home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Attribute/Abstract.php:118
[0x0000000011a31350] loadByCode() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Config.php:423
[0x0000000011a30ce8] getAttribute() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Helper/Output.php:156
[0x0000000011a30208] categoryAttribute() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/catalog/category/view.phtml:47
[0x0000000011a2fa60] +++ dump failed

[01-Apr-2012 14:27:56] [pool magento] pid 21530
script_filename = /home/flyfish/www/flyshop/index.php
[0x0000000011a35b10] updateParamDefaults() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:276
[0x0000000011a35750] updateParamDefaults() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:326
[0x0000000011a351f0] getSkinBaseUrl() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:482
[0x0000000011a350a8] getSkinUrl() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:981
[0x0000000011a32468] getSkinUrl() /home/flyfish/www/flyshop/app/code/local/Extendware/EWMinify/Block/Override/Mage/Page/Html/Head.php:126
[0x0000000011a30ca8] getCssJsHtml() /home/flyfish/www/flyshop/app/code/local/Extendware/EWCore/Block/Override/Mage/Page/Html/Head.php:55
[0x0000000011a30978] getCssJsHtml() /home/flyfish/www/flyshop/app/code/local/MageWorx/SeoSuite/Block/Page/Html/Head.php:41
[0x0000000011a2fd10] getCssJsHtml() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-modalheader/rokmage-head.phtml:26
[0x0000000011a2f568] +++ dump failed

[01-Apr-2012 14:28:28] [pool magento] pid 21527
script_filename = /home/flyfish/www/flyshop/index.php
[0x0000000010c7bba0] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228
[0x0000000010c7b910] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110
[0x0000000010c7b288] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300
[0x0000000010c7ae18] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479
[0x0000000010c7a830] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238
[0x0000000010c7a470] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389
[0x0000000010c7a168] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:808
[0x0000000010c79558] fetchPairs() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Collection.php:840
[0x0000000010c79240] addCountToCategories() /home/flyfish/www/flyshop/app/code/community/Mage/Catalog/Block/Navigation.php:133
[0x0000000010c71d48] getCurrentChildCategories() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/rokmagemodules/rokmage-magemenus/rokmage-magemenu-left.phtml:139
[0x0000000010c715a0] +++ dump failed

[01-Apr-2012 14:28:28] [pool magento] pid 21577
script_filename = /home/flyfish/www/flyshop/index.php
[0x0000000011a3a8d8] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228
[0x0000000011a3a648] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110
[0x0000000011a39fc0] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300
[0x0000000011a39b50] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479
[0x0000000011a39568] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238
[0x0000000011a391a8] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389
[0x0000000011a38f40] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734
[0x0000000011a37cc0] fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:276
[0x0000000011a37b20] _loadNodes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:1229
[0x0000000011a379a0] getChildrenCategories() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Category.php:841
[0x0000000011a37690] getChildrenCategories() /home/flyfish/www/flyshop/app/code/community/Mage/Catalog/Block/Navigation.php:130
[0x0000000011a30198] getCurrentChildCategories() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/rokmagemodules/rokmage-magemenus/rokmage-magemenu-left.phtml:139
[0x0000000011a2f9f0] +++ dump failed

[01-Apr-2012 14:28:48] [pool magento] pid 21629
script_filename = /home/flyfish/www/flyshop/index.php
[0x00002ac987e2cb48] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:252
[0x00002ac987e2c660] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:132
[0x00002ac987e2bf20] _afterLoad() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536
[0x00002ac987e2ba28] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253
[0x00002ac987e2aee8] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330
[0x00002ac987e2ab10] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458
[0x00002ac987e2a990] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264
[0x00002ac987e2a618] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244
[0x00002ac987e2a528] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308
[0x00002ac987e28f18] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122
[0x00002ac987e28770] +++ dump failed

___________________________________________

A snippet of the latest php-fpm error log:

[01-Apr-2012 14:26:12] WARNING: [pool magento] child 21537, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.265105 sec), logging
[01-Apr-2012 14:26:12] ERROR: failed to ptrace(PEEKDATA) pid 21537: Input/output error (5)
[01-Apr-2012 14:26:44] WARNING: [pool magento] child 21531, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.268434 sec), logging
[01-Apr-2012 14:26:44] ERROR: failed to ptrace(PEEKDATA) pid 21531: Input/output error (5)
[01-Apr-2012 14:27:01] WARNING: [pool magento] child 21528, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (6.656633 sec), logging
[01-Apr-2012 14:27:01] ERROR: failed to ptrace(PEEKDATA) pid 21528: Input/output error (5)
[01-Apr-2012 14:27:04] WARNING: [pool magento] child 21529, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.211136 sec), logging
[01-Apr-2012 14:27:55] WARNING: [pool magento] child 21536, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.207001 sec), logging
[01-Apr-2012 14:27:55] ERROR: failed to ptrace(PEEKDATA) pid 21536: Input/output error (5)
[01-Apr-2012 14:27:56] WARNING: [pool magento] child 21530, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.503186 sec), logging
[01-Apr-2012 14:27:56] ERROR: failed to ptrace(PEEKDATA) pid 21530: Input/output error (5)
[01-Apr-2012 14:28:28] WARNING: [pool magento] child 21577, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.722625 sec), logging
[01-Apr-2012 14:28:28] WARNING: [pool magento] child 21527, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.122326 sec), logging
[01-Apr-2012 14:28:28] ERROR: failed to ptrace(PEEKDATA) pid 21527: Input/output error (5)
[01-Apr-2012 14:28:28] ERROR: failed to ptrace(PEEKDATA) pid 21577: Input/output error (5)
[01-Apr-2012 14:28:48] WARNING: [pool magento] child 21629, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.446961 sec), logging
[01-Apr-2012 14:28:48] ERROR: failed to ptrace(PEEKDATA) pid 21629: Input/output error (5)

_____________________________________________

I also noticed that the server is not using much memory:

Mem: 16777216k total, 1204040k used, 15573176k free

my.cnf settings:

query_cache_size = 128M
innodb_buffer_pool_size = 512M
open-files-limit = 8192
table_cache = 4096

I just noticed that someone had changed my innodb_buffer_pool_size to 512M. Shouldn't this be set to around 80% of available RAM? Since I have 16 GB of RAM, that would be about 12G; however, I set it at 10G. What do you think? I made that change and restarted everything, but php-fpm is still spiking the CPU. Here is just one php-fpm process:

23942 user 17 0 507m 99m 27m R 90.9%CPU 0.6 0:03.46 php-fpm

I'm sure there may be more information you will need to help, so just let me know what you need and I'll post it. Thank you.
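
    As a reference point only, here is a minimal my.cnf sketch in the direction discussed above. The values are assumptions for a dedicated 16 GB MySQL host and would need to be validated against the actual workload:

        [mysqld]
        # Buffer pool: a common rule of thumb is 60-80% of RAM on a dedicated
        # database server; noticeably less if MySQL shares the box with php-fpm.
        innodb_buffer_pool_size = 10G
        # A modest query cache; very large caches can add lock contention.
        query_cache_size = 64M
        table_cache = 4096
        open-files-limit = 8192

    Note that several of the slow backtraces above bottom out in database calls (fetchPairs() and fetchAll() through Zend_Db), so enabling the MySQL slow query log alongside the php-fpm slowlog would help show whether the time is going to the queries themselves or to PHP.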

    Read the article

  • best way to reference business objects from presentation layer..?

    - by Vytas999
    I want to develop an enterprise app that includes a Windows Forms presentation layer, middle-tier components for business logic and data access, and a MS SQL Server database. The middle-tier components should contain some business objects and will be called from the presentation layer using .NET Remoting. Which is the best way (and why) to reference these business objects from the presentation layer? a) Create a class library project implementing the business objects, and reference this project from both the presentation layer and the middle tier. b) Create an interface library project defining the business objects and a class library project implementing those interfaces; reference the class library project from the middle tier and the interface library project from the presentation layer. c) Create separate class library projects for the middle tier and the presentation layer, and reference the corresponding project from the presentation layer.
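
    To make option b) concrete, here is a minimal C# sketch of the shared-interface approach; all type names, the host name, port, and URL are invented for the example, and channel registration/hosting setup is elided:

        using System;

        // Contracts assembly: referenced by BOTH the presentation layer and the middle tier.
        [Serializable]
        public class Customer
        {
            public int Id;
            public string Name;
        }

        public interface ICustomerService
        {
            Customer GetCustomer(int id);
        }

        // Middle-tier assembly: the only place the implementation lives.
        public class CustomerService : MarshalByRefObject, ICustomerService
        {
            public Customer GetCustomer(int id)
            {
                // ...load via the data access layer; stubbed here...
                return new Customer { Id = id, Name = "Sample" };
            }
        }

        // Presentation layer: obtains a proxy through the shared interface only,
        // so it never needs a reference to the implementation assembly.
        class Client
        {
            static void Main()
            {
                var service = (ICustomerService)Activator.GetObject(
                    typeof(ICustomerService),
                    "tcp://appserver:8080/CustomerService.rem");
                Console.WriteLine(service.GetCustomer(42).Name);
            }
        }

    The practical benefit of b) is deployment isolation: the business logic can change and be redeployed on the server without recompiling or reshipping the client, as long as the interface assembly stays stable.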

    Read the article

  • WCF DataContracts and underlying data structures

    - by Xerx
    I am wondering which objects it makes sense to expose through a WCF service: should I add WCF serialization attributes to my business entities, or should I implement a converter that maps my business entities to the DataContracts I want to expose through the service? Right now I have entities at different levels (DataAccess, Business, and Contract), with converters in place that map entities from DataAccess to Business and from Business to Contract, and vice versa. Implementing and maintaining those converters is time-consuming and pretty tedious. What are the best practices in relation to this? And if I were using an ORM such as NHibernate or Entity Framework, should I expose the ORM entities directly, or abstract them the same way I am doing now? Thanks in advance.
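
    For illustration, a minimal C# sketch of the converter approach described above; every type and member name here is invented for the example:

        using System.Runtime.Serialization;

        // Contract layer: the shape exposed over WCF, versioned independently
        // of the internal business entity.
        [DataContract]
        public class OrderContract
        {
            [DataMember] public int Id { get; set; }
            [DataMember] public decimal Total { get; set; }
        }

        // Business layer entity: no serialization attributes required here.
        public class Order
        {
            public int Id { get; set; }
            public decimal Total { get; set; }
            public string InternalNotes { get; set; } // intentionally never exposed
        }

        // The tedious-but-explicit mapping the question mentions.
        public static class OrderConverter
        {
            public static OrderContract ToContract(Order order)
            {
                return new OrderContract { Id = order.Id, Total = order.Total };
            }

            public static Order ToBusiness(OrderContract contract)
            {
                return new Order { Id = contract.Id, Total = contract.Total };
            }
        }

    The usual trade-off: attribute-decorated business entities mean less code but couple the service contract to the domain model, while explicit DTOs keep the wire format stable at the cost of mapping code. A mapping library such as AutoMapper can take much of the tedium out of the latter approach.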

    Read the article

  • Rhapsody TestConductor Experiences

    - by vaiomike
    I was wondering whether anybody out there is actively using Rhapsody TestConductor, or has tried it for a while but then decided to turn it down for a particular reason. If so, what are your experiences, in which field do you apply it, what are its shortcomings, or why did you turn it down? At the moment we're considering TestConductor as our tool of choice for testing, as it's already integrated into Rhapsody, and we would like to find out how applicable it is to our project (by the way, we're using Rhapsody 7.4 in C). P.S.: Recommendations for good books on Model-Based Testing are also appreciated.

    Read the article

  • How do I set default values on new properties for existing entities after lightweight Core Data migration?

    - by Moritz
    I've successfully completed lightweight migration on my Core Data model. My custom entity Vehicle received a new property 'tirePressure', an optional property of type double with the default value 0.00. When 'old' Vehicles are fetched from the store (Vehicles that were created before the migration took place), the value of their 'tirePressure' property is nil. (Is that expected behavior?) So I thought: "No problem, I'll just do this in the Vehicle class:"

        - (void)awakeFromFetch
        {
            [super awakeFromFetch];
            if (nil == self.tirePressure) {
                [self willChangeValueForKey:@"tirePressure"];
                self.tirePressure = [NSNumber numberWithDouble:0.0];
                [self didChangeValueForKey:@"tirePressure"];
            }
        }

    Since "change processing is explicitly disabled around" awakeFromFetch, I thought the calls to willChangeValueForKey: and didChangeValueForKey: would mark 'tirePressure' as dirty. But they don't: every time these Vehicles are fetched from the store, 'tirePressure' is still nil, despite the context having been saved.

    Read the article

  • Business Objects Web Intelligence-style reporting in a WinForms app. Is this possible?

    - by George
    I have a WinForms app that displays results in a GridView control. If a user right-clicks on a row, he can then, from a popup menu, select a command to perform on the row, much like in Windows File Explorer. But now I want to give the user the ability to construct a filter so that he can control which rows are displayed in the grid without affecting the functionality of the application. I would also like the user to be able to select the columns/fields that he sees in the grid. For the basis of this question, let's assume that the data displayed in the grid comes from a single table. Business Objects' Web Intelligence and Desktop Intelligence applications give me very flexible and powerful reporting capabilities, but I want to integrate this capability into my WinForms application. Does Business Objects, or maybe Crystal Reports, provide this sort of functionality? I can construct my own query builder, but I'd rather not reinvent the wheel.
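
    If it does come down to rolling it yourself, user-defined row filtering and column selection over a single-table result set can be built from plain ADO.NET and WinForms pieces. A rough C# sketch, with all names invented for the example:

        using System.Data;
        using System.Windows.Forms;

        static class GridFilterHelper
        {
            // Applies a user-built filter expression and hides deselected columns.
            // 'filterExpression' uses DataColumn expression syntax,
            // e.g. "Status = 'Open' AND Amount > 100".
            public static void Apply(DataGridView grid, DataTable table,
                                     string filterExpression, string[] hiddenColumns)
            {
                var view = new DataView(table) { RowFilter = filterExpression };
                grid.DataSource = view;

                foreach (string name in hiddenColumns)
                    grid.Columns[name].Visible = false;
            }
        }

    Crystal Reports does ship a WinForms viewer control for embedded reporting, but for simple row/column control over an existing grid, a filter expression along these lines may be all that is needed.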

    Read the article
