Search Results

Search found 58436 results on 2338 pages for 'data'.

  • How to work with files and streams in PHP when a file opened in Class A is passed as an open stream to Class B

    - by Rachel
    I have two classes, a Business Logic class (BLO) and a Data Access class (DAO), and my BLO class depends on my DAO class. I open a CSV file for writing inside the BLO constructor, as I create the BLO object and pass in the file name from the command prompt:

        public function __construct($file) {
            // Open Unica file for parsing.
            $this->fin = fopen($file, 'w+') or die('Cannot open file');
            // Initialize the repository DAO.
            $this->dao = new Dao('REPO');
        }

    Inside the BLO I have one function, notifiy(), which has a dependency on the DAO class: it calls getCurrentDBSnapshot() from the DAO and passes the open stream so that data gets populated into the stream.

        public function notifiy() {
            $data = $this->fin;
            var_dump($data); // resource(9) of type (stream)
            // Passing in $data, which is resource(9) of type (stream).
            $outputFile = $this->dao->getCurrentDBSnapshot($data);
        }

    The DAO function getCurrentDBSnapshot() gets the current state of a database table:

        public function getCurrentDBSnapshot($data) {
            // Prepare and execute the query.
            $query = "SELECT * FROM myTable";
            $stmt = $this->connection->prepare($query);
            $stmt->execute();
            $header = array();
            while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
                if (empty($header)) {
                    // Write the column names from myTable as the CSV header.
                    $header = array_keys($row);
                    fputcsv($data, $header);
                }
                // Export every row to the file.
                fputcsv($data, $row);
            }
            var_dump($data); // resource(9) of type (stream)
            return $data;
        }

    So I am getting resource(9) of type (stream) back from getCurrentDBSnapshot() and storing that stream in $outputFile in the BLO method notifiy(). Now I want to close the file I opened for writing, so should it be fclose() of $outputFile or of $data? In both cases fclose() gives me bool(false), and afterwards var_dump($outputFile) and var_dump($data) both show resource(9) of type (Unknown). My question: if I open a file using fopen() in class A, call a class B method from a class A method and pass the open stream, and class B does some work and returns the stream, how can I close that open stream back in class A's method? Or is it OK to keep the stream open and never call fclose()? I would appreciate inputs, as I am not very sure how this should be implemented.
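
    A pattern worth noting here: whichever object opens a stream should own it and close it exactly once; collaborators only borrow the handle. The sketch below is a minimal, hypothetical illustration of that ownership rule (the class, file, and field names are invented, not the poster's code). On pre-8 versions of PHP, calling fclose() on a resource that is already closed warns and returns bool(false), and the resource then dumps as type (Unknown), which matches the symptoms above.

        <?php
        // Minimal ownership sketch (hypothetical names): the opener of the
        // stream closes it exactly once; the writer only borrows it.

        class SnapshotWriter
        {
            // Writes rows into a stream it does NOT own.
            public function write($stream, array $rows)
            {
                foreach ($rows as $i => $row) {
                    if ($i === 0) {
                        fputcsv($stream, array_keys($row)); // header line
                    }
                    fputcsv($stream, $row);
                }
            }
        }

        $fin = fopen('snapshot.csv', 'w+') or die('Cannot open file');

        $writer = new SnapshotWriter();
        $writer->write($fin, array(
            array('id' => 1, 'name' => 'Rudolph'),
            array('id' => 2, 'name' => 'Dasher'),
        ));

        var_dump(fclose($fin)); // bool(true) on the first and only close

    Because only one object ever calls fclose(), the double-close symptom (bool(false), type (Unknown)) cannot occur, and there is no need for the callee to return the stream at all.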

  • What is the fastest way to copy all data to a new, larger hard drive?

    - by SUPER user
    I was certain this would have been covered before, but I cannot find an answer amongst all the almost-duplicates that come up; sorry if I've missed something obvious. I have a full 320 GB disk inside my machine, a new 1 TB disk to replace it, and a USB 2.0 chassis. It is only data on a single partition, no OS/apps involved, and the old drive will be kept somewhere as a backup (no secure wiping etc.). The simple option would be to put the new disk in the USB chassis, copy the files, then swap the drives over. But for USB pen drives, reading is around 4x faster than writing. If the same is true for a USB SATA chassis (is it?), then it would be significantly faster to swap the drives first and read from the old drive over USB, right? The other consideration is that copying lots of files is usually slower than copying a single file of equivalent size. Is Windows 7 smart enough to do everything in a single lump like that, or is there specialised software that should be used instead? (Even if SATA-to-SATA copying is faster than involving USB, knowing what to do when that isn't an option is useful information.) Summary:
    - Does a USB SATA chassis suffer from a read/write inequality? (Like a USB pen drive does, but unlike a direct SATA connection.)
    - Can Windows 7 do sequential access? (I can't find confirmation of whether Robocopy does this.)
    - Or is it necessary to use a bootable CD/USB with something like Clonezilla to achieve sequential copy speeds?

  • How can I calculate data for a boxplot (quartiles, median) in a Rails app on Heroku? (Heroku uses PostgreSQL)

    - by hadees
    I'm trying to calculate the data needed to generate a box plot, which means I need to figure out the 1st and 3rd quartiles along with the median. I have found some solutions for doing it in PostgreSQL; however, they seem to depend on either PL/Python or PL/R, neither of which Heroku appears to have enabled for its PostgreSQL databases - in fact I ran "select lanname from pg_language;" and only got back "internal". I also found some code to do it in pure Ruby, but that seems somewhat inefficient to me. I'm rather new to box plots, PostgreSQL, and Ruby on Rails, so I'm open to suggestions on how I should handle this. There is the possibility of having a lot of data, which is why I'm concerned with performance; however, if the solution ends up being too complex, I may just do it in Ruby and, if my application gets big enough to warrant it, get my own PostgreSQL instance that I can host somewhere else. *Note: since I was only able to post one link (because I'm new), I decided to share a pastie with some relevant information.
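
    The quartile arithmetic itself is small enough to do in the application layer. Here is a minimal sketch of the computation (written in PHP purely for illustration; the same few lines port directly to Ruby), using the common "median of the halves" definition of Q1/Q3 - note that other quartile definitions interpolate differently:

        <?php
        // Median of an already-sorted numeric array.
        function median(array $sorted)
        {
            $n = count($sorted);
            $mid = (int) floor($n / 2);
            return ($n % 2 === 1)
                ? (float) $sorted[$mid]
                : ($sorted[$mid - 1] + $sorted[$mid]) / 2.0;
        }

        // Five-number summary for a boxplot: min, Q1, median, Q3, max.
        function boxplotStats(array $values)
        {
            sort($values);
            $n = count($values);
            $mid = (int) floor($n / 2);
            $lower = array_slice($values, 0, $mid); // values below the median
            $upper = array_slice($values, ($n % 2 === 1) ? $mid + 1 : $mid);
            return array(
                'min'    => $values[0],
                'q1'     => median($lower),
                'median' => median($values),
                'q3'     => median($upper),
                'max'    => $values[$n - 1],
            );
        }

        print_r(boxplotStats(array(1, 3, 3, 4, 7, 8, 9, 12, 13)));
        // min 1, q1 3, median 7, q3 10.5, max 13

    Fetching just the one numeric column, pre-sorted by the database, and doing this in the app avoids PL/Python and PL/R entirely; if the PostgreSQL version has window functions, ntile(4) in plain SQL is a coarser but database-side alternative.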

  • What stage of normalization is this? (moving repeating data into separate table)

    - by Sergio
    Hi there, I have noticed that when designing a database I tend to shift any repeating sets of data into a separate table. For example, say I had a table of people, with each person living in a state. I would then move these repeating states into a separate table and reference them with foreign keys. However, what if I were not storing any more data about states? I would then have a table with just StateID and State in it. Is this the correct approach? State is dependent on the primary key of the users table, so does shifting it into its own table help with anything? Thanks,

  • Bitmap to Texture2D problem with colors

    - by xnaNewbie89
    I have a small problem converting a Bitmap to a Texture2D: the resulting image has the red channel swapped with the blue channel :/ I don't know why, because the pixel formats are the same. If someone can help me I will be very happy :)

        System.Drawing.Image image = System.Drawing.Bitmap.FromFile(ImageFileLoader.filename);
        System.Drawing.Bitmap bitmap = new System.Drawing.Bitmap(image);
        Texture2D mapTexture = new Texture2D(Screen.Game.GraphicsDevice,
            bitmap.Width, bitmap.Height, false, SurfaceFormat.Color);
        System.Drawing.Imaging.BitmapData data = bitmap.LockBits(
            new System.Drawing.Rectangle(0, 0, bitmap.Width, bitmap.Height),
            System.Drawing.Imaging.ImageLockMode.ReadOnly,
            System.Drawing.Imaging.PixelFormat.Format32bppArgb);
        byte[] bytes = new byte[data.Height * data.Width * 4];
        System.Runtime.InteropServices.Marshal.Copy(data.Scan0, bytes, 0, bytes.Length);
        mapTexture.SetData<byte>(bytes, 0, data.Height * data.Width * 4);
        bitmap.UnlockBits(data);
        bitmap.Dispose();
        image.Dispose();

  • How can I change the default location/action of 'Open Outlook Data File' in Outlook 2010?

    - by Chadddada
    I have recently deployed a Remote Desktop Host server that functions as a remote Microsoft Office 2010 workspace for users. As part of locking down this server I have installed all programs on the D: drive and, through the use of Group Policy, hidden all the drives on the server from standard users. In addition to hiding these drives, I am not allowing users to save anything locally (on the server) or open Libraries. However, one of the functions of the server is to provide the Outlook client, and users will often have their .PST file stored on a network location and want to open it in Outlook. Can I change the default location that File > Open > Open Outlook Data File looks in, or the action it takes? The default location seems to be under Users / Libraries, and when users click 'Open' they get a warning:

        This operation has been cancelled due to restrictions in effect on this computer.

    Clicking OK drops the user into a small menu that shows the attached network drives under Computer. Can I instead have the 'Open' click drop the users into a defined network drive, or just open Computer and allow them to select a share? I don't want them to see the error message. A solution that looks to have been used for Office 2000/03 is:

        Key:        HKEY_CURRENT_USER\Software\Microsoft\Office\<version>\Outlook
        Value name: ForceOSTPath
        Value type: REG_EXPAND_SZ
        Value:      path to your storage folder

    I am not sure if there is a better way to do this now, or if this even works with Office 2010.

  • XBRL - Moving from Production to Consumption

    - by jmorourke
    Here's an update on what's new with XBRL and how it can actually benefit your organization, rather than adding extra time and cost to financial reporting. On February 29th (leap day) of 2012 I attended the XBRL and Financial Analysis Technology Conference at Baruch College in NYC. The event, which attracted over 300 XBRL gurus and fans, was presented by XBRL US, The New York Society of Security Analysts' Improved Corporate Reporting Committee, and Baruch College's Robert Zicklin Center for Corporate Integrity. The event featured keynotes from the U.S. Securities and Exchange Commission (SEC) and the CFA Institute, as well as panels covering alternative research tools and data, corporate reporting to stakeholders, and a demonstration of XBRL analysis tools. The program culminated in a presentation of the finalists and the winner of the $20,000 XBRL Challenge. Some of the key points made in the sessions:
    · The focus of XBRL tools is moving from production to consumption.
    · As of February 2012, over 9,000 companies are reporting in XBRL, with over 10 million facts filed to date.
    · XBRL taxonomy extensions have dropped from 27% to 11%, making comparisons easier.
    · The SEC reports that XBRL makes it easier to analyze disclosures and focus on accounting issues.
    · XBRL is helping standards-setters like the FASB speed their analysis of the impacts of proposed accounting rule changes.
    · Companies like Thomson Reuters report that XBRL is helping speed the delivery of data to clients.
    The most interesting part of the program, though, was the session highlighting the 5 finalists in the XBRL Challenge competition and the winning solution. The XBRL Challenge was launched in 2011 as a means of spurring the development of more end-user tools to help with the consumption of XBRL-based financial information. Over an 8-month process handled by 5 judges, there were 84 registrants, 15 completed submissions, 5 finalists and one winner of the challenge. All of the solutions are open-sourced tools and most of them focus on consuming XBRL-based data. The 5 finalists:
    · Advanced XBRL Processing from Oxide Solutions - XBRL viewer for taxonomies, filings and company data, with peer comparison capabilities.
    · Arelle - API for XBRL processes; supports SEC validations, RSS feeds to access filings, etc.
    · Calcbench - XBRL data analysis tool that can be embedded in other web applications; it can combine XBRL filings with real-time market data.
    · XBRL to XL - allows the importing of XBRL data into Microsoft Excel for analysis and comparisons. Users start on the web and populate Excel with XBRL data.
    · XBurble - allows users to search and view XBRL filings, export to Excel, merge for comparison, and includes a workflow interface.
    The winner of the $20,000 XBRL Challenge prize was Calcbench. More information about the XBRL Challenge and the finalists can be found at www.XBRLUS.org/challenge.
    XBRL for Sustainability Reporting - other recent news on the XBRL front was the announcement by the Global Reporting Initiative (GRI) of an XBRL taxonomy for Sustainability Reporting. This taxonomy was co-developed by the GRI and Deloitte and is designed to make the consumption of data found in Sustainability Reports much easier.
    Although there is no government mandate to file Sustainability Reports in XBRL format, organizations that use the GRI guidelines for Sustainability Reporting are encouraged to tag and submit their data voluntarily to the GRI, which will populate a database with Sustainability Reporting data and make it available to the public. For more information about this initiative, go to the GRI web site: www.globalreporting.org.
    So how does all of this benefit corporate filers and investors? Since its introduction, the consensus in the market is that XBRL has mainly benefited the regulators and investment analysts who need to consume and analyze large volumes of financial data. But the emergence of more end-user tools for consuming and analyzing XBRL-based data, and the ability to perform quick comparisons of one company versus its peers and competitors in an industry group, will soon accelerate the benefits to corporate finance staff as well as individual investors. This could apply to financial results tagged in XBRL, as well as non-financial information such as Sustainability Reporting, which over the long term will likely be integrated with financial reporting. And as multiple regulators and agencies in a country adopt the XBRL standard for corporate filings, more benefits will accrue as companies are able to leverage one set of XBRL-based financial data for multiple regulatory filings.
    For more information about the latest developments in XBRL, check out the XBRL US or XBRL International web sites: www.xbrl.org, www.xbrlus.org. For more information about what Oracle is doing to support XBRL, here are some links:
    http://www.oracle.com/us/solutions/ent-performance-bi/disclosure-management-065892.html
    http://www.oracle.com/technetwork/database/features/xmldb/index-087631.html
    Feel free to contact me if you have any questions or need more information: [email protected]

  • Loading XML file containing leading zeros with SSIS preserving the zeros

    - by Compudicted
    Visiting the MSDN SQL Server Integration Services forum, I would often see people pop up asking this question: "why am I not able to load an element from an XML file that contains zeros so that the leading/trailing zeros remain intact?". I started to suspect that such a trivial and often-required operation is perhaps misunderstood by the developer community. I would also add that the whole state of affairs surrounding XML today is probably being affected by a motion of people who dislike XML in general and many aspects of it, as XSD and XSLT invoke a negative reaction at best. Nevertheless, XML is in wide use today and its importance as a bridge between diverse systems is ever increasing. Therefore, I decided to write up an example of loading an arbitrary XML file that contains leading zeros in one of its elements into a table in a SQL Server database, using SSIS, so that the leading zeros are preserved - keeping simplicity in mind.
    To start off, bring up BIDS (running as admin) and add a new Data Flow Task (DFT). This DFT will serve as the container for our XML processing elements (besides, the XML Source is not available anywhere else other than from within a DFT). Double-click your DFT and drag and drop the XML Source component from the Tool Box's Data Flow Sources. Now, let the fun begin! Inspired by the upcoming Christmas, I created a simple XML file with one set of data that contains an imaginary SSN for Rudolph with several leading zeros, like 0000003. This file can be viewed here. Configuring the XML Source to point to our XML file is of course intuitive; next, the XML Source needs either an embedded schema (XSD) or it can generate one for us. In lack of one, I opted to auto-generate it, and I ended up with an XSD that looked like:

        <?xml version="1.0"?>
        <xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified"
                   xmlns:xs="http://www.w3.org/2001/XMLSchema">
          <xs:element name="XMasEvent">
            <xs:complexType>
              <xs:sequence>
                <xs:element minOccurs="0" name="CaseInfo">
                  <xs:complexType>
                    <xs:sequence>
                      <xs:element minOccurs="0" name="ID" type="xs:unsignedByte" />
                      <xs:element minOccurs="0" name="CreatedDate" type="xs:unsignedInt" />
                      <xs:element minOccurs="0" name="LastName" type="xs:string" />
                      <xs:element minOccurs="0" name="FirstName" type="xs:string" />
                      <xs:element minOccurs="0" name="SSN" type="xs:unsignedByte" /> <!-- becomes string -->
                      <xs:element minOccurs="0" name="DOB" type="xs:unsignedInt" />
                      <xs:element minOccurs="0" name="Event" type="xs:string" />
                      <xs:element minOccurs="0" name="ClosedDate" />
                    </xs:sequence>
                  </xs:complexType>
                </xs:element>
              </xs:sequence>
            </xs:complexType>
          </xs:element>
        </xs:schema>

    As an aside on the XML file: if your XML file does not contain the outer node (<XMasEvent>), you may end up in a situation where you see just one field in the output. Now please note that the SSN element's data type was auto-chosen as unsignedByte, and for a reason: all the figures in the element are digits. This is good, but not exactly what we need, because if we attempt to load the data with this XSD we will either get errors at the destination or, most typically, lose the leading zeros. So the next intuitive choice is to change the data type to string.
    Besides, if an SSIS package was already created based on this XSD and the data type was changed thereafter, one should reset the metadata by right-clicking the XML Source and choosing "Advanced Editor", where a Refresh button at the bottom left will do the trick. So far so good; we are ready to load our XML file. Well, actually yes and no: in my experience some data conversion is typically required, so depending on your data destination you may need to tweak the targeted data types. Let's add a Data Conversion Task to our DFT. To make the story short, I will only cover the SSN field: in my data source the target SQL table has it as nchar(10), and we chose string in our XSD (yes, this is a big difference), so under such circumstances SSIS will complain. We therefore go and change the data type of the SSN column to a Unicode string (DT_WSTR). We are almost there; now all we need is to configure the destination. For simplicity I chose SQL Server Destination. The mapping is a breeze; F5, and I am able to insert my data into SQL Server now! Checking the zeros - they are all intact!

  • How to store or share live data between PHP Requests?

    - by Devyn
    Hi, I want to start a project for Facebook, and the application will be something like a real-time multiplayer chess game. The problem I'm having is that I have no idea how to store the data when a player moves a piece, and how to update the new position in player 2's browser. I'm going to use PHP and MySQL on the server side and jQuery for client rendering. The simplest idea is to store the data in XML or MySQL and regenerate the result in player 2's browser, but I know that when thousands of players are playing, that will not be an efficient way. Since I don't have time to study a new language for this project, I'm going to have to stick with PHP. I'm not going to use Flash either, because I want my client side lightweight and Flash-free. So is there any way that will solve my problems?
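
    A common way to fit this into plain PHP + MySQL + jQuery is short polling: persist every move as a row, and have the opponent's browser periodically ask a tiny endpoint for moves it has not seen yet. Below is a minimal sketch, assuming a hypothetical moves table (game_id, seq, move) where seq numbers the moves in order:

        <?php
        // poll.php - short-polling sketch (hypothetical schema:
        // moves(game_id INT, seq INT, move VARCHAR(10))).
        // The client calls poll.php?game=123&after=7 every few seconds.

        $pdo = new PDO('mysql:host=localhost;dbname=chess', 'user', 'pass');
        $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        $gameId = (int) (isset($_GET['game']) ? $_GET['game'] : 0);
        $after  = (int) (isset($_GET['after']) ? $_GET['after'] : 0);

        // Return only the moves the client has not applied yet.
        $stmt = $pdo->prepare(
            'SELECT seq, move FROM moves WHERE game_id = ? AND seq > ? ORDER BY seq');
        $stmt->execute(array($gameId, $after));

        header('Content-Type: application/json');
        echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

    On the client, a jQuery setInterval() call fetches this JSON every couple of seconds, applies the new moves to the board, and remembers the highest seq it has seen. Polling is chatty with thousands of concurrent games - long polling or a push service scales better - but it keeps the stack to exactly PHP, MySQL, and jQuery.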

  • Real Time BI in the Real World

    - by tobin.gilman(at)oracle.com
    One of my favorite BI offerings from Oracle is a solution called Oracle Real Time Decisions. Whenever I mention this product in customer meetings, eyes light up. There are some fascinating examples of customers using it to up-sell, cross-sell, increase customer retention, and reduce risk in real time, with off-the-charts return on investment. I plan to share some of those stories in a future blog. In this post, however, I want to share some far more common real-time analytics use case scenarios that are being addressed with widely deployed Oracle BI and data integration technologies. Not all real-time BI applications require continuous learning, predictive modeling, and data mining. Many simply require the ability to integrate, aggregate, and access information that is current (typically within a few minutes or a few seconds). The use cases are infinite. A few I've seen:
    · Purchasing agents need to match demand against available inventory
    · Manufacturing planners need to monitor current parts and material against scheduled build plans
    · Airline agents need to match ticket demand against flight schedules
    · Human resources managers need to track the status of global hiring requisitions against current headcount authorizations
    ...you get the idea. One way of doing this is to run reports or federated queries directly against transactional systems. That approach can be viable if you only need to access simple data sets on rare occasions. High-volume and complex queries can quickly bog down the performance of mission-critical transactional systems. There is an architecturally simple way of solving the problem, and it's being applied by real companies around the world to solve real needs in real time. Cbeyond, an Atlanta, GA based provider of voice, data and mobile business applications, delivers real-time information to its call center agents as they are interacting with their customers. The data the agents need resides in production CRM and other transactional systems, but instead of reporting directly off those systems, the data is first moved to an operational data store (ODS). Rather than running data-intensive, time-consuming, and performance-degrading batch ETL routines to populate the ODS, Cbeyond uses Oracle GoldenGate software to incrementally capture and move only the changed records from the log files of the transactional systems every few minutes.
    There is no impact on transactional system performance, and the information needed by call center representatives is up to date. Oracle Business Intelligence software presents the information to service reps in a rich, visual, and highly interactive format. Avea is similar to Cbeyond. They are a telecommunications company that integrates billing and customer information in an ODS that is accessed by their call center agents in real time using Oracle GoldenGate and Oracle Business Intelligence. They've taken it a step further by using the ODS to feed a data warehouse. The operational data store provides the current information needed by call center agents during "in flight" customer interactions; the data warehouse is used for more sophisticated analysis of historical data. For maximum performance, both the ODS and the data warehouse run on the Oracle Exadata Database Machine. These are practical illustrations of companies addressing real-time reporting and analysis needs using established business intelligence/data warehousing methodologies and tools common to many IT departments. If real-time BI could benefit your organization, you may already be closer than you thought to having the pieces in place to solve the problem. Give us a shout if you are interested in learning more or if you have an interesting use or approach to real-time BI.

  • Is it possible to find records that form a straight run in MySQL, based on date or sequence?

    - by user1987816
    I want to get the data that has a straight run of Sell three or more times. Is this possible in MySQL? If not, how do I get it right? I need it in MySQL or PHP. My table:

        +----------+---------------------+--------+
        | Username | Date                | Action |
        +----------+---------------------+--------+
        | Adam     | 2014-08-20 22:30:20 | Sell   |
        | Adam     | 2014-08-20 22:30:20 | Sell   |
        | Adam     | 2014-08-20 22:30:20 | Sell   |
        | Adam     | 2014-08-20 22:30:20 | Buy    |
        | Adam     | 2014-08-20 22:30:20 | Buy    |
        | Adam     | 2014-08-20 22:30:20 | Sell   |
        | Adam     | 2014-08-20 22:30:20 | Sell   |
        | Adam     | 2014-08-20 22:30:20 | Sell   |
        | Adam     | 2014-08-20 22:30:20 | Sell   |
        | Nick     | 2014-08-20 22:30:20 | Sell   |
        | Nick     | 2014-08-20 22:30:20 | Sell   |
        | Nick     | 2014-08-20 22:30:20 | Sell   |
        | Nick     | 2014-08-20 22:30:20 | Sell   |
        | Nick     | 2014-08-20 22:30:20 | Buy    |
        +----------+---------------------+--------+

    From the table above, I need to list all the runs of 3 or more straight Sells:

        +----------+---------------------+--------+-------------+
        | Username | Date                | Action | Straight 3+ |
        +----------+---------------------+--------+-------------+
        | Adam     | 2014-08-20 22:30:20 | Sell   | 3           |
        | Adam     | 2014-08-20 22:30:20 | Sell   | 4           |
        | Nick     | 2014-08-20 22:30:20 | Sell   | 4           |
        +----------+---------------------+--------+-------------+
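
    Since MySQL only gained window functions in 8.0, the simplest reliable route is to read the rows in order and count the consecutive Sells in PHP. A minimal sketch, assuming a hypothetical trades table with an AUTO_INCREMENT id column to define the row order (the dates in the sample are all identical, so the date alone cannot order the rows):

        <?php
        // Streak-counting sketch (hypothetical schema:
        // trades(id INT AUTO_INCREMENT, username, date, action)).

        $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
        $rows = $pdo->query(
            'SELECT username, date, action FROM trades ORDER BY username, id'
        )->fetchAll(PDO::FETCH_ASSOC);

        $streaks = array();
        $user = null;
        $run = 0;
        foreach ($rows as $row) {
            if ($row['username'] !== $user || $row['action'] !== 'Sell') {
                // A new user or a Buy ends the current run; keep it if long enough.
                if ($run >= 3) {
                    $streaks[] = array('username' => $user, 'straight' => $run);
                }
                $run = 0;
            }
            $user = $row['username'];
            if ($row['action'] === 'Sell') {
                $run++;
            }
        }
        if ($run >= 3) { // flush the final run
            $streaks[] = array('username' => $user, 'straight' => $run);
        }
        print_r($streaks); // the sample data yields Adam: 3, Adam: 4, Nick: 4

    The same walk can be done inside MySQL 5.x with user variables, but variable evaluation order is notoriously fragile across versions, which is why the run-length counting is sketched in PHP here.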

  • Is it possible to put binary image data into html markup and then get the image displayed as usual in the browser?

    - by Joern Akkermann
    It's an important security issue, and I'm sure this should be possible. A simple example: you run a community portal. Users are registered and upload their pictures. Your application enforces security rules about when a picture is allowed to be displayed - for example, two users must be friends on both sides, according to the system, before one can view the other's uploaded pictures. Here comes the problem: it is possible for someone to crawl the image directories of your server, and you want to protect your users from such attacks. If it's possible to put the binary data of an image directly into the HTML markup, you can restrict access to your image dirs to the user and group that your web application runs as, and pass the image data to the browser directly in the HTML your Apache user and group generate. The only possible weakness then is the password of the user that your web app runs as. Is there already a possibility?
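
    What the question describes does exist: data URIs (RFC 2397) let a page carry base64-encoded image bytes inline in the markup, so the image directory never has to sit under the web server's document root at all. A minimal PHP sketch, where canViewPicture() is a hypothetical stand-in for the "users must be friends" rule:

        <?php
        // Inline-image sketch: canViewPicture() and the variables are
        // hypothetical; the picture directory lives outside the web root.

        function imageAsDataUri($path)
        {
            $mime = mime_content_type($path); // e.g. image/png
            $b64  = base64_encode(file_get_contents($path));
            return "data:$mime;base64,$b64";
        }

        if (canViewPicture($viewerId, $ownerId)) {
            $src = imageAsDataUri('/var/app/private/pics/' . $pictureFile);
            echo '<img src="' . $src . '" alt="profile picture">';
        }

    Base64 inflates the bytes by roughly a third, and some older browsers cap data URI length, so the more common variant of the same idea is a small PHP script that performs the access check and then streams the file with readfile(); either way, the binary data never sits in a crawlable directory.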

  • What database works well with 200+GB of data?

    - by taw
    I've been using MySQL (with InnoDB, on Amazon RDS) because it's the sort of universal default, but it's been ridiculously under-performing, and tweaking it only delays the inevitable. The data is mostly relatively short blobs (<1 kB each) of information about 100Ms of URLs. There is (or should be - MySQL cannot seem to handle it) a very high volume of insert / update / retrieve operations, but few complex queries - not that complex queries wouldn't be useful, but MySQL is so slow that it's far faster to get the data out, process it locally, and cache the results somewhere. I can keep tweaking MySQL and throwing more hardware at it, but it seems increasingly futile. So what are the options? SQL/relational model/etc. optional - anything will do as long as it's fast, networked, and language-independent.

  • PHP PDO: Fetching data as objects - properties assigned BEFORE __construct is called. Is this correct?

    - by Erik
    The full question should be "Is this correct, or is it some bug I can't count on?" I've been working with PDO more, and in particular playing with fetching data directly into objects. While doing so I discovered this: if I fetch data directly into an object like this:

        $STH = $DBH->prepare('SELECT first_name, address from people WHERE 1');
        $obj = $STH->fetchAll(PDO::FETCH_CLASS, 'person');

    and have an object like this:

        class person {
            public $first_name;
            public $address;

            function __construct() {
                $this->first_name = $this->first_name . " is the name";
            }
        }

    it shows me that the properties are being assigned before __construct is called - because the names all have " is the name" appended. Is this some bug (in which case I can't/won't count on it), or is this The Way It Should Be? Because it's really quite a nice thing the way it works currently.
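
    This ordering is in fact PDO's documented default rather than a bug: with plain PDO::FETCH_CLASS, the column values are assigned to the properties first and __construct() runs afterwards, and the PDO::FETCH_PROPS_LATE flag exists precisely to flip that order. A small sketch (assuming an existing PDO connection in $DBH):

        <?php
        class Person
        {
            public $first_name;
            public $address;

            public function __construct()
            {
                // Default FETCH_CLASS: $this->first_name is already populated
                // here. With FETCH_PROPS_LATE it is still null at this point.
                $this->first_name .= ' is the name';
            }
        }

        $STH = $DBH->prepare('SELECT first_name, address FROM people');
        $STH->execute();

        // Constructor runs BEFORE the properties are assigned:
        $people = $STH->fetchAll(PDO::FETCH_CLASS | PDO::FETCH_PROPS_LATE, 'Person');

    So the observed behavior is reliable and can be counted on, and the flag is the escape hatch for classes whose constructors must run before the data lands.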

  • How to show all methods and data when an object has no "__iter__" function in Python

    - by zjm1126
    I found a way:
    (1) dir(object) gives me a string:

        a = "['__class__', '__contains__', '__delattr__', '__delitem__', '__dict__', '__doc__', '__getattribute__', '__getitem__', '__hash__', '__init__', '__iter__', '__metaclass__', '__module__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__setitem__', '__str__', '__weakref__', '_errors', '_fields', '_prefix', '_unbound_fields', 'confirm', 'data', 'email', 'errors', 'password', 'populate_obj', 'process', 'username', 'validate']"

    (2) b = eval(a)
    (3) and it becomes a list of all those method and attribute names.
    (4) Then, to show the object's methods, my full code is:

        s = ''
        a = eval(str(dir(object)))
        for i in a:
            s += str(i) + ':' + str(object[i])
        print s

    but it shows this error:

        KeyError: '__class__'

    So how do I make my code run? Thanks.

  • How do you data drive task dependencies via properties in psake?

    - by Jordan
    In MSBuild you can data-drive target dependencies by passing an item group into a target, like so:

        <ItemGroup>
            <FullBuildDependsOn Include="Package;CoreFinalize"
                                Condition="@(FullBuildDependsOn) == ''" />
        </ItemGroup>
        <Target Name="FullBuild" DependsOnTargets="@(FullBuildDependsOn)" />

    If you don't override the FullBuildDependsOn item group, the FullBuild target defaults to depending on the Package and CoreFinalize targets. However, you can override this by defining your own FullBuildDependsOn item group. I'd like to do the same in psake - for example:

        properties {
            $FullBuildDependsOn = "Package", "CoreFinalize"
        }

        task default -depends FullBuild

        # this won't work because $FullBuildDependsOn hasn't been defined yet -
        # the "Task" function will see this as a null depends array
        task FullBuild -depends $FullBuildDependsOn

    What do I need to do to data-drive the task dependencies in psake?

  • Best way to transfer data from one page to another in ASP.NET

    - by Anish Karunakaran
    In my scenario I am using a JavaScript popup that opens another form, an address entry form with merely 20 controls in it. I am currently passing data from the address page back to the main page using a session variable that stores a data table of values. Two different session variables are used this way, one for the permanent address and one for the temporary address. I know that using session variables degrades performance. What is the best way to transfer values from one page to another? Regards, Anish.

  • What are the most efficient MySQL column types for this data?

    - by AlabamaKush
    I have several tables with some pretty standard data in each. Can somebody help me optimize them by telling me the best column types for this data? Beside each item is what I currently have:

        Number (max length 7)   --> MEDIUMINT(8) UNSIGNED
        Text (max length 30)    --> VARCHAR(30)
        Text (max length 200)   --> VARCHAR(200)
        Number (max length 4)   --> SMALLINT(5) UNSIGNED
        Number (either 0 or 1)  --> TINYINT(1) UNSIGNED
        Text (max length 500)   --> TEXT

    Any suggestions? I'm just guessing with this, so I know some of them are wrong...
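
    For what it's worth, those guesses are internally consistent: MEDIUMINT UNSIGNED tops out at 16,777,215 (covers 7 digits), SMALLINT UNSIGNED at 65,535 (covers 4 digits), and the display widths in parentheses are purely cosmetic - storage is determined by the base type and UNSIGNED alone. Here is a sketch applying them to a hypothetical table (all column names invented), issued through PDO:

        <?php
        // Illustrative DDL only; the table and columns are invented.
        // In MySQL >= 5.0.3, VARCHAR(500) would also work for the last
        // column instead of TEXT.

        $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
        $pdo->exec('
            CREATE TABLE sample (
                id        MEDIUMINT(8) UNSIGNED NOT NULL AUTO_INCREMENT, -- max 16,777,215
                name      VARCHAR(30)  NOT NULL,
                summary   VARCHAR(200) NOT NULL,
                quantity  SMALLINT(5)  UNSIGNED NOT NULL,                -- max 65,535
                is_active TINYINT(1)   UNSIGNED NOT NULL,                -- 0 or 1
                body      TEXT         NOT NULL,
                PRIMARY KEY (id)
            ) ENGINE=InnoDB
        ');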

  • Is it better to use List or Collection?

    - by Vivin Paliath
    I have an object that stores some data in a list. The implementation could change later, and I don't want to expose the internal implementation to the end user. However, the user must have the ability to modify and access this collection of data. Currently I have something like this:

        public List<SomeDataType> getData() {
            return this.data;
        }

        public void setData(List<SomeDataType> data) {
            this.data = data;
        }

    Does this mean that I have allowed the internal implementation details to leak out? Should I be doing this instead?

        public Collection<SomeDataType> getData() {
            return this.data;
        }

        public void setData(Collection<SomeDataType> data) {
            this.data = new ArrayList<SomeDataType>(data);
        }

  • XSS as attack vector even if XSS data not stored?

    - by Klaas van Schelven
    I have a question about XSS. Can forms be used as a vector for XSS even if the data is not stored in the database and used at a later point? In PHP the code would be this:

        <form input="text" value="<?= @$_POST['my_field'] ?>" name='my_field'>

    Showing an alert box (to demonstrate that JS can be run) in your own browser is trivial with the code above. But is this exploitable across browsers as well? The only scenario I see is one where you trick someone into visiting a certain page, i.e. a combination of CSRF and XSS. As for "stored in a database and used at a later point": the scenario I understand about XSS is one where you're able to post data to a site that runs the JavaScript and shows it on a page in a browser that has greater or different privileges than your own. But, to be clear, that is not what I'm talking about above.
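
    What the question describes is reflected XSS, and it is exploitable across browsers: the attacker does not type the payload into their own browser, but lures the victim to a link or an auto-submitting form that sends the script to the vulnerable page, where it then runs with the victim's cookies and privileges. Escaping on output is the standard defense; a minimal sketch:

        <?php
        // Reflected XSS: the request itself carries the payload, e.g.
        //   my_field = "><script>/* read document.cookie */</script>
        // Escaping on output keeps the payload as inert text.

        $raw = isset($_POST['my_field']) ? $_POST['my_field'] : '';

        // Vulnerable: attacker-controlled markup is echoed verbatim.
        // echo '<input type="text" name="my_field" value="' . $raw . '">';

        // Safe: special characters become entities.
        $safe = htmlspecialchars($raw, ENT_QUOTES, 'UTF-8');
        echo '<input type="text" name="my_field" value="' . $safe . '">';

    So "stored" is not required: the script executes in the victim's browser, not the attacker's, which is exactly why the reflected variant pairs with the trick-someone-into-visiting step the question mentions.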
