Search Results

Search found 10644 results on 426 pages for 'flash integration'.


  • Running a simple integration scenario using the Oracle Big Data Connectors on Hadoop/HDFS cluster

    - by hamsun
    Between the elephant (the traditional image of the Hadoop framework) and the Oracle Iron Man (Big Data..), an English setter could be seen as the link to the right data.

    Data, data, data: we live in a world where data technology based on popular applications, search engines, web servers, rich SMS messages, email clients, weather forecasts and so on plays a predominant role in our lives. More and more technologies are used to analyze and track our behavior, to detect patterns, and to propose "the best/right user experience", from the Google Ad services to telco companies or large consumer sites (like Amazon :) ). The more we use all these technologies, the more data we generate, and thus there is a need for huge data marts and specific hardware/software servers (such as the Exadata servers) in order to process, analyze and understand the trends and offer new services to the users.

    Some of these "data feeds" are raw, unstructured data and cannot be processed effectively by normal SQL queries. Large-scale distributed processing was an emerging infrastructure need, and the solution seemed to be the "collocation of compute nodes with the data", which in turn led to the MapReduce parallel pattern and the development of the Hadoop framework, which is based on MapReduce and a distributed file system (HDFS) that runs on large clusters of rather inexpensive servers. Several Oracle products use this distribute/aggregate pattern for data calculation (Coherence, NoSQL Database, TimesTen), so once you are familiar with one of these technologies, let's say with Coherence aggregators, you will find the whole Hadoop/MapReduce concept very similar.

    Oracle Big Data Appliance is based on the Cloudera Distribution (CDH), and the Oracle Big Data Connectors can be plugged into a Hadoop cluster running the CDH distribution or an equivalent Hadoop cluster. In this paper, a "lab-like" implementation of this concept is done on a single Linux x64 server, running an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 and a single-node Apache hadoop-1.2.1 HDFS cluster, using the SQL Connector for HDFS. The whole setup is fairly simple:

    Install on a Linux x64 server (or VirtualBox appliance) an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 server. Get the Apache Hadoop distribution from http://mir2.ovh.net/ftp.apache.org/dist/hadoop/common/hadoop-1.2.1 and the Oracle Big Data Connectors from http://www.oracle.com/technetwork/bdc/big-data-connectors/downloads/index.html?ssSourceSiteId=ocomen.

    Check the Java version of your Linux server with the command:

        java -version
        java version "1.7.0_40"
        Java(TM) SE Runtime Environment (build 1.7.0_40-b43)
        Java HotSpot(TM) 64-Bit Server VM (build 24.0-b56, mixed mode)

    Decompress the hadoop-1.2.1.tar.gz file to /u01/hadoop-1.2.1 and modify your .bash_profile (also see my sample .bash_profile):

        export HADOOP_HOME=/u01/hadoop-1.2.1
        export PATH=$PATH:$HADOOP_HOME/bin
        export HIVE_HOME=/u01/hive-0.11.0
        export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin

    Set up SSH trust for the Hadoop processes; this is a mandatory step. In our case we have to establish a "local trust", as we are using a single-node configuration: copy the new public key to the list of authorized keys, then connect and test the SSH setup to your localhost. We will run a "pseudo-Hadoop cluster" in what is called "local standalone mode": all the Hadoop Java components run in one Java process, which is enough for our demo purposes.
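    The SSH trust step is only described above, not shown; on a single-node box it might look like the following sketch (the key type, file paths and use of an empty passphrase are assumptions, not taken from the article):

        # generate a passwordless key for the hadoop/oracle user and authorize it locally
        ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
        cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
        chmod 600 ~/.ssh/authorized_keys

        # test: this should return the hostname without prompting for a password
        ssh localhost hostname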
    We need to "fine tune" some Hadoop configuration files: go to $HADOOP_HOME/conf and modify the files core-site.xml, hdfs-site.xml and mapred-site.xml. Check that the Hadoop binaries are referenced correctly from the command line by executing:

        hadoop -version

    As Hadoop manages our "clustered HDFS" file system, we have to create "the mount point" and format it; the mount point is declared in core-site.xml. The layout under /u01/hadoop-1.2.1/data will be created and used by the other Hadoop components (MapReduce uses /mapred/..., HDFS uses the /dfs/... layout structure). Format the HDFS file system, start the Java components for the HDFS system and, as an additional check, use the Hadoop GUI browsers to check the content of your HDFS configuration.

    Once our HDFS Hadoop setup is done, you can use the HDFS file system to store data (big data :) ) and move it back and forth to Oracle databases by means of the Big Data Connectors (which is the next configuration step). You could create and use a Hive DB, but in our case we will make a simple integration of "raw data" through the creation of an external table on a local Oracle instance (on the same Linux box we run the one-node Hadoop HDFS cluster and one Oracle DB).

    Download some public "big data"; I use the site http://france.meteofrance.com/france/observations, from where I can get *.csv files for my big data simulations :). Here is the data layout of my example file: Download the Big Data Connector from OTN (oraosch-2.2.0.zip) and unzip it to your local file system (see picture below). Modify your environment in order to access the connector libraries, and make the following test:

        [oracle@dg1 bin]$ ./hdfs_stream
        Usage: hdfs_stream locationFile
        [oracle@dg1 bin]$

    Load the data into the Hadoop HDFS file system:

        hadoop fs -mkdir bgtest_data
        hadoop fs -put obsFrance.txt bgtest_data/obsFrance.txt
        hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt
        [oracle@dg1 bg-data-raw]$ hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt
        Found 1 items
        -rw-r--r--   1 oracle supergroup   54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt
        [oracle@dg1 bg-data-raw]$ hadoop fs -ls hdfs:///user/oracle/bgtest_data/obsFrance.txt
        Found 1 items
        -rw-r--r--   1 oracle supergroup   54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt

    Check the content of the HDFS with the browser UI.
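    The configuration, formatting and startup steps referred to above are not shown verbatim in this excerpt. On a Hadoop 1.2.x single-node setup they might look like the sketch below; the port 19000 and the /u01/... paths are taken from the article's own output, while the exact property values and the OSCH_HOME export are assumptions (hdfs-site.xml and mapred-site.xml would be edited along the same lines):

        # $HADOOP_HOME/conf/core-site.xml -- minimal single-node settings (assumed values):
        <configuration>
          <property><name>fs.default.name</name><value>hdfs://localhost:19000</value></property>
          <property><name>hadoop.tmp.dir</name><value>/u01/hadoop-1.2.1/data</value></property>
        </configuration>

        # format the HDFS file system and start the HDFS daemons
        hadoop namenode -format
        start-dfs.sh

        # make the Oracle SQL Connector for HDFS preprocessor (hdfs_stream) reachable
        export OSCH_HOME=/u01/orahdfs-2.2.0
        export PATH=$PATH:$OSCH_HOME/bin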
    Start the Oracle database, and run the following script in order to create the Oracle database user and the Oracle directories for the Oracle Big Data Connector (dg1 is my own DB SID; replace it with yours):

        #!/bin/bash
        export ORAENV_ASK=NO
        export ORACLE_SID=dg1
        . oraenv
        sqlplus /nolog <<EOF
        CONNECT / AS sysdba;
        CREATE OR REPLACE DIRECTORY osch_bin_path AS '/u01/orahdfs-2.2.0/bin';
        CREATE USER BGUSER IDENTIFIED BY oracle;
        GRANT CREATE SESSION, CREATE TABLE TO BGUSER;
        GRANT EXECUTE ON sys.utl_file TO BGUSER;
        GRANT READ, EXECUTE ON DIRECTORY osch_bin_path TO BGUSER;
        CREATE OR REPLACE DIRECTORY BGT_LOG_DIR as '/u01/BG_TEST/logs';
        GRANT READ, WRITE ON DIRECTORY BGT_LOG_DIR to BGUSER;
        CREATE OR REPLACE DIRECTORY BGT_DATA_DIR as '/u01/BG_TEST/data';
        GRANT READ, WRITE ON DIRECTORY BGT_DATA_DIR to BGUSER;
        EOF

    Put the following in a file named t3.sh and make it executable:

        hadoop jar $OSCH_HOME/jlib/orahdfs.jar \
          oracle.hadoop.exttab.ExternalTable \
          -D oracle.hadoop.exttab.tableName=BGTEST_DP_XTAB \
          -D oracle.hadoop.exttab.defaultDirectory=BGT_DATA_DIR \
          -D oracle.hadoop.exttab.dataPaths="hdfs:///user/oracle/bgtest_data/obsFrance.txt" \
          -D oracle.hadoop.exttab.columnCount=7 \
          -D oracle.hadoop.connection.url=jdbc:oracle:thin:@//localhost:1521/dg1 \
          -D oracle.hadoop.connection.user=BGUSER \
          -D oracle.hadoop.exttab.printStackTrace=true \
          -createTable --noexecute

    Then test the creation of the external table with it:

        [oracle@dg1 samples]$ ./t3.sh
        ./t3.sh: line 2: /u01/orahdfs-2.2.0: Is a directory
        Oracle SQL Connector for HDFS Release 2.2.0 - Production
        Copyright (c) 2011, 2013, Oracle and/or its affiliates. All rights reserved.
        Enter Database Password:]
        The create table command was not executed. The following table would be created.
        CREATE TABLE "BGUSER"."BGTEST_DP_XTAB"
        ( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000),
          "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) )
        ORGANIZATION EXTERNAL
        ( TYPE ORACLE_LOADER
          DEFAULT DIRECTORY "BGT_DATA_DIR"
          ACCESS PARAMETERS
          ( RECORDS DELIMITED BY 0X'0A'
            CHARACTERSET AL32UTF8
            STRING SIZES ARE IN CHARACTERS
            PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream'
            FIELDS TERMINATED BY 0X'2C'
            MISSING FIELD VALUES ARE NULL
            ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000),
              "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) )
          LOCATION ( 'osch-20131022081035-74-1' ) )
        PARALLEL REJECT LIMIT UNLIMITED;
        The following location files would be created.
        osch-20131022081035-74-1 contains 1 URI, 54103 bytes
               54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt

    Then remove the --noexecute flag and run the script again to create the external Oracle table for the Hadoop data. Check the results:

        The create table command succeeded.
        CREATE TABLE "BGUSER"."BGTEST_DP_XTAB"
        ( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000),
          "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) )
        ORGANIZATION EXTERNAL
        ( TYPE ORACLE_LOADER
          DEFAULT DIRECTORY "BGT_DATA_DIR"
          ACCESS PARAMETERS
          ( RECORDS DELIMITED BY 0X'0A'
            CHARACTERSET AL32UTF8
            STRING SIZES ARE IN CHARACTERS
            PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream'
            FIELDS TERMINATED BY 0X'2C'
            MISSING FIELD VALUES ARE NULL
            ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000),
              "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) )
          LOCATION ( 'osch-20131022081719-3239-1' ) )
        PARALLEL REJECT LIMIT UNLIMITED;
        The following location files were created.
        osch-20131022081719-3239-1 contains 1 URI, 54103 bytes
               54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt
    This is the view from SQL Developer. And finally, the number of lines in the Oracle table imported from our Hadoop HDFS cluster:

        SQL> select count(*) from "BGUSER"."BGTEST_DP_XTAB";

          COUNT(*)
        ----------
              1151

    In a future post we will integrate data from a Hive database and try some ODI integrations with the ODI Big Data connector. Our simplistic approach is just a first step to show you how this unstructured-data world can be integrated with the Oracle infrastructure. Hadoop, Big Data and NoSQL are great technologies; they are widely used, and Oracle offers a large integration infrastructure based on these services.

    Oracle University presents a complete curriculum on all the related Oracle technologies:
    NoSQL: Introduction to Oracle NoSQL Database; Using Oracle NoSQL Database.
    Big Data: Introduction to Big Data; Oracle Big Data Essentials; Oracle Big Data Overview.
    Oracle Data Integrator: Oracle Data Integrator 12c: New Features; Oracle Data Integrator 11g: Integration and Administration; Oracle Data Integrator: Administration and Development; Oracle Data Integrator 11g: Advanced Integration and Development.
    Oracle Coherence 12c: Oracle Coherence 12c: New Features; Oracle Coherence 12c: Share and Manage Data in Clusters.
    Oracle GoldenGate 11g: Fundamentals for Oracle; Fundamentals for SQL Server; Fundamentals for DB2; Fundamentals for Teradata; Fundamentals for HP NonStop; Management Pack: Overview; Troubleshooting and Tuning; Advanced Configuration for Oracle.

    Other resources: Apache Hadoop (http://hadoop.apache.org/) is the home page for these technologies. "Hadoop: The Definitive Guide, 3rd Edition" by Tom White is a classic read for people who want to know more about Hadoop, and some active "googling" will also give you more references.

    About the author: Eugene Simos is based in France and joined Oracle through the BEA-WebLogic acquisition, where he worked in Professional Services, Support and Education for major accounts across the EMEA region. He has worked in the banking sector, at ATT and at telco companies, giving him extensive experience with production environments. Eugene currently specializes in Oracle Fusion Middleware, teaching an array of courses on WebLogic/WebCenter, Content, BPM, SOA, Identity & Security, GoldenGate, Virtualisation and Unified Communications Suite throughout the EMEA region.

    Read the article

  • How to Avoid Your Next 12-Month Science Project

    - by constant
    While most customers immediately understand how the magic of Oracle's Hybrid Columnar Compression, intelligent storage servers and flash memory make Exadata uniquely powerful against home-grown database systems, some people think that Exalogic is nothing more than a bunch of x86 servers, a storage appliance and an InfiniBand (IB) network, built into a single rack. After all, isn't this exactly what the High Performance Computing (HPC) world has been doing for decades? On the surface, this may be true. And some people tried exactly that: They tried to put together their own version of Exalogic, but then they discover there's a lot more to building a system than buying hardware and assembling it together. IT is not Ikea. Why is that so? Could it be there's more going on behind the scenes than merely putting together a bunch of servers, a storage array and an InfiniBand network into a rack? Let's explore some of the special sauce that makes Exalogic unique and un-copyable, so you can save yourself from your next 6- to 12-month science project that distracts you from doing real work that adds value to your company. Engineering Systems is Hard Work! The backbone of Exalogic is its InfiniBand network: 4 times better bandwidth than even 10 Gigabit Ethernet, and only about a tenth of its latency. What a potential for increased scalability and throughput across the middleware and database layers! But InfiniBand is a beast that needs to be tamed: It is true that Exalogic uses a standard, open-source Open Fabrics Enterprise Distribution (OFED) InfiniBand driver stack. Unfortunately, this software has been developed by the HPC community with fastest speed in mind (which is good) but, despite the name, not many other enterprise-class requirements are included (which is less good). Here are some of the improvements that Oracle's InfiniBand development team had to add to the OFED stack to make it enterprise-ready, simply because typical HPC users didn't have the need to implement them: More than 100 bug fixes in the pieces that were not related to the Message Passing Interface Protocol (MPI), which is the protocol that HPC users use most of the time, but which is less useful in the enterprise. Performance optimizations and tuning across the whole IB stack: From Switches, Host Channel Adapters (HCAs) and drivers to low-level protocols, middleware and applications. Yes, even the standard HPC IB stack could be improved in terms of performance. Ethernet over IB (EoIB): Exalogic uses InfiniBand internally to reach high performance, but it needs to play nicely with datacenters around it. That's why Oracle added Ethernet over InfiniBand technology to it that allows for creating many virtual 10GBE adapters inside Exalogic's nodes that are aggregated and connected to Exalogic's IB gateway switches. While this is an open standard, it's up to the vendor to implement it. In this case, Oracle integrated the EoIB stack with Oracle's own IB to 10GBE gateway switches, and made it fully virtualized from the beginning. This means that Exalogic customers can completely rewire their server infrastructure inside the rack without having to physically pull or plug a single cable - a must-have for every cloud deployment. Anybody who wants to match this level of integration would need to add an InfiniBand switch development team to their project. Or just buy Oracle's gateway switches, which are conveniently shipped with a whole server infrastructure attached! 
IPv6 support for InfiniBand's Sockets Direct Protocol (SDP), Reliable Datagram Sockets (RDS), TCP/IP over IB (IPoIB) and EoIB protocols. Because no IPv6 = not very enterprise-class. HA capability for SDP. High Availability is not a big requirement for HPC, but for enterprise-class application servers it is. Every node in Exalogic's InfiniBand network is connected twice for redundancy. If any cable or port or HCA fails, there's always a replacement link ready to take over. This requires extra magic at the protocol level to work. So in addition to Weblogic's failover capabilities, Oracle implemented IB automatic path migration at the SDP level to avoid unnecessary failover operations at the middleware level. Security, for example spoof-protection. Another feature that is less important for traditional users of InfiniBand, but very important for enterprise customers. InfiniBand Partitioning and Quality-of-Service (QoS): One of the first questions we get from customers about Exalogic is: “How can we implement multi-tenancy?” The answer is to partition your IB network, which effectively creates many networks that work independently and that are protected at the lowest networking layer possible. In addition to that, QoS allows administrators to prioritize traffic flow in multi-tenancy environments so they can keep their service levels where it matters most. Resilient IB Fabric Management: InfiniBand is a self-managing network, so a lot of the magic lies in coming up with the right topology and in teaching the subnet manager how to properly discover and manage the network. Oracle's Infiniband switches come with pre-integrated, highly available fabric management with seamless integration into Oracle Enterprise Manager Ops Center. In short: Oracle elevated the OFED InfiniBand stack into an enterprise-class networking infrastructure. Many years and multiple teams of manpower went into the above improvements - this is something you can only get from Oracle, because no other InfiniBand vendor can give you these features across the whole stack! Exabus: Because it's not About the Size of Your Network, it's How You Use it! So let's assume that you somehow were able to get your hands on an enterprise-class IB driver stack. Or maybe you don't care and are just happy with the standard OFED one? Anyway, the next step is to actually leverage that InfiniBand performance. Here are the choices: Use traditional TCP/IP on top of the InfiniBand stack, Develop your own integration between your middleware and the lower-level (but faster) InfiniBand protocols. While more bandwidth is always a good thing, it's actually the low latency that enables superior performance for your applications when running on any networking infrastructure: The lower the latency, the faster the response travels through the network and the more transactions you can close per second. The reason why InfiniBand is such a low latency technology is that it gets rid of most if not all of your traditional networking protocol stack: Data is literally beamed from one region of RAM in one server into another region of RAM in another server with no kernel/drivers/UDP/TCP or other networking stack overhead involved! Which makes option 1 a no-go: Adding TCP/IP on top of InfiniBand is like adding training wheels to your racing bike. It may be ok in the beginning and for development, but it's not quite the performance IB was meant to deliver. Which only leaves option 2: Integrating your middleware with fast, low-level InfiniBand protocols. 
And this is what Exalogic's "Exabus" technology is all about. Here are a few Exabus features that help applications leverage the performance of InfiniBand in Exalogic: RDMA and SDP integration at the JDBC driver level (SDP), for Oracle Weblogic (SDP), Oracle Coherence (RDMA), Oracle Tuxedo (RDMA) and the new Oracle Traffic Director (RDMA) on Exalogic. Using these protocols, middleware can communicate a lot faster with each other and the Oracle database than by using standard networking protocols, Seamless Integration of Ethernet over InfiniBand from Exalogic's Gateway switches into the OS, Oracle Weblogic optimizations for handling massive amounts of parallel transactions. Because if you have an 8-lane Autobahn, you also need to improve your ramps so you can feed it with many cars in parallel. Integration of Weblogic with Oracle Exadata for faster performance, optimized session management and failover. As you see, “Exabus” is Oracle's word for describing all the InfiniBand enhancements Oracle put into Exalogic: OFED stack enhancements, protocols for faster IB access, and InfiniBand support and optimizations at the virtualization and middleware level. All working together to deliver the full potential of InfiniBand performance. Who else has 100% control over their middleware so they can develop their own low-level protocol integration with InfiniBand? Even if you take an open source approach, you're looking at years of development work to create, test and support a whole new networking technology in your middleware! The Extras: Less Hassle, More Productivity, Faster Time to Market And then there are the other advantages of Engineered Systems that are true for Exalogic the same as they are for every other Engineered System: One simple purchasing process: No headaches due to endless RFPs and no “Will X work with Y?” uncertainties. Everything has been engineered together: All kinds of bugs and problems have been already fixed at the design level that would have only manifested themselves after you have built the system from scratch. Everything is built, tested and integrated at the factory level . Less integration pain for you, faster time to market. Every Exalogic machine world-wide is identical to Oracle's own machines in the lab: Instant replication of any problems you may encounter, faster time to resolution. Simplified patching, management and operations. One throat to choke: Imagine finger-pointing hell for systems that have been put together using several different vendors. Oracle's Engineered Systems have a single phone number that customers can call to get their problems solved. For more business-centric values, read The Business Value of Engineered Systems. Conclusion: Buy Exalogic, or get ready for a 6-12 Month Science Project And here's the reason why it's not easy to "build your own Exalogic": There's a lot of work required to make such a system fly. In fact, anybody who is starting to "just put together a bunch of servers and an InfiniBand network" is really looking at a 6-12 month science project. And the outcome is likely to not be very enterprise-class. And it won't have Exalogic's performance either. Because building an Engineered System is literally rocket science: It takes a lot of time, effort, resources and many iterations of design/test/analyze/fix to build such a system. That's why InfiniBand has been reserved for HPC scientists for such a long time. 
And only Oracle can bring the power of InfiniBand to customers in an enterprise-class, ready-to-use, pre-integrated version, without the develop/integrate/support pain. For more details, check the new Exalogic overview white paper, which was updated only recently. P.S.: Thanks to my colleagues Ola, Paul, Don and Andy for helping me put together this article!

    Read the article

  • Flash CS3/AS3 - How to Mask Nested MovieClips in External Classes

    - by Max Jackson
    I have a number of external class files that make up (or are trying to build) a portfolio. One of the class files for this project is a Menu.as class. I tried extends, but I've yet to use extends without it becoming a ball of tangled holiday cheer. So my main portfolio class (the one where I'm assembling everything) calls an instance of the Menu class. From the Preloader through the Portfolio class into the Menu class is where I'm passing the content, because I want to package things properly. This is Menu content, so naturally I want to position it in a properly named spot. I'm trying to reveal this Menu in a mask and I'm getting the old #1009 error. In a trace, this will work:

        trace(site_mc.menu_mc.mainMask_mc); // returns [object mainMask_mc_4]

    However, when I try to truncate the string into a single compact_mc...

        compactMenu_mc = site_mc.menu_mc.mainMask_mc;
        trace(compact_mc); // it won't trace (#1009).

    I said to hell with it, but now I need to have one MovieClip mask another. So I figure I can't go all...

        parent.parent.parent.clip_mc.mask = parent.parent.parent.masked_mc

    Probably because of datatyping and whatever else. I hate to be vague, but I'm new and have been working like gangbusters for days to get this portfolio up. Any suggestions or pointers on things my noob brain might've missed are given much thanks. :)

    Read the article

  • Retrieve POST data from Flash to ASP.Net

    - by Martin Ongtangco
    Here's my AS3 code:

        var jpgEncoder:JPGEncoder = new JPGEncoder(100);
        var jpgStream:ByteArray = jpgEncoder.encode(bitmapData);
        var header:URLRequestHeader = new URLRequestHeader("Content-type", "application/octet-stream");
        var jpgURLRequest:URLRequest = new URLRequest("/patients/webcam.aspx");
        jpgURLRequest.requestHeaders.push(header);
        jpgURLRequest.method = URLRequestMethod.POST;
        jpgURLRequest.data = jpgStream;
        navigateToURL(jpgURLRequest, "_self");

    And here's my ASP.Net code:

        try
        {
            string pt = Path.Combine(PathFolder, "test.jpg");
            HttpFileCollection fileCol = Request.Files;
            Response.Write(fileCol.Count.ToString());
            foreach (HttpPostedFile hpf in fileCol)
            {
                hpf.SaveAs(pt);
            }
        }
        catch (Exception ex)
        {
            Response.Write(ex.Message);
        }

    I'm getting a weird error; HttpFox mentioned "NS_ERROR_NET_RESET". Any help would be excellent! Thanks!

    Read the article

  • Progress bar in a Flash MP3 Player

    - by Deryck
    Hi, I have coded a simple XML-driven MP3 player. I have used the Sound and SoundChannel objects and methods, but I can't find a way to make a progress bar. I don't need a loading progress bar; I need a song progress status bar. Can anybody help me? Thanks.

    UPDATE: Here is the code.

        var musicReq:URLRequest;
        var thumbReq:URLRequest;
        var music:Sound = new Sound();
        var sndC:SoundChannel;
        var currentSnd:Sound = music;
        var position:Number;
        var currentIndex:Number = 0;
        var songPaused:Boolean;
        var songStopped:Boolean;
        var lineClr:uint;
        var changeClr:Boolean;
        var xml:XML;
        var songList:XMLList;

        var loader:URLLoader = new URLLoader();
        loader.addEventListener(Event.COMPLETE, Loaded);
        loader.load(new URLRequest("musiclist.xml"));

        var thumbHd:MovieClip = new MovieClip();
        thumbHd.x = 50;
        thumbHd.y = 70;
        addChild(thumbHd);

        function Loaded(e:Event):void{
            xml = new XML(e.target.data);
            songList = xml.song;
            musicReq = new URLRequest(songList[0].url);
            thumbReq = new URLRequest(songList[0].thumb);
            music.load(musicReq);
            sndC = music.play();
            title_txt.text = songList[0].title + " - " + songList[0].artist;
            loadThumb();
            sndC.addEventListener(Event.SOUND_COMPLETE, nextSong);
        }

        function loadThumb():void{
            var thumbLoader:Loader = new Loader();
            thumbReq = new URLRequest(songList[currentIndex].thumb);
            thumbLoader.load(thumbReq);
            thumbLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, thumbLoaded);
        }

        function thumbLoaded(e:Event):void {
            var thumb:Bitmap = (Bitmap)(e.target.content);
            var holder:MovieClip = thumbHd;
            holder.addChild(thumb);
        }

        prevBtn.addEventListener(MouseEvent.CLICK, prevSong);
        nextBtn.addEventListener(MouseEvent.CLICK, nextSong);
        playBtn.addEventListener(MouseEvent.CLICK, playSong);

        function prevSong(e:Event):void{
            if(currentIndex > 0){
                currentIndex--;
            } else{
                currentIndex = songList.length() - 1;
            }
            var prevReq:URLRequest = new URLRequest(songList[currentIndex].url);
            var prevPlay:Sound = new Sound(prevReq);
            sndC.stop();
            title_txt.text = songList[currentIndex].title + " - " + songList[currentIndex].artist;
            sndC = prevPlay.play();
            currentSnd = prevPlay;
            songPaused = false;
            loadThumb();
            sndC.addEventListener(Event.SOUND_COMPLETE, nextSong);
        }

        function nextSong(e:Event):void {
            if(currentIndex

    And here is the code for the length and position. It's inside a MovieClip; that's why I use an absolute path to find the Sound object.

        this.addEventListener(Event.ENTER_FRAME, moveSpeaker);
        var initWidth:Number = this.SpkCone.width;
        var initHeight:Number = this.SpkCone.height;
        var rootObj:Object = root;

        function moveSpeaker(eventArgs:Event) {
            var average:Number = ((rootObj.audioPlayer_mc.sndC.leftPeak + rootObj.audioPlayer_mc.sndC.rightPeak) / 2) * 10;
            // trace(average);
            // trace(initWidth + ":" + initHeight);
            trace(rootObj.audioPlayer_mc.sndC.position + "/" + rootObj.audioPlayer_mc.music.length);
            this.SpkCone.width = initWidth + average;
            this.SpkCone.height = initHeight + average;
        }

    Read the article

  • js popup window to play .flv flash video using jwplayer.swf

    - by Mike Trader
    JS popup window to play .flv using jwplayer.swf. I would like to adjust this code so that it does not crash the browser when expanded to full screen, and so that the popup closes when it loses focus.

        <html>
        <head>
        <title>Popup Example</title>
        <center>
        <div class="yt_container">
        <div id="yt_the_video" class="yt_video_full">
        <script type="text/javascript" src="swfobject.js"></script>
        <script type="text/javascript">
        var s1 = new SWFObject("player.swf","ply","640","500","9","#FFFFFF");
        s1.addParam("allowfullscreen","true");
        s1.addParam("allownetworking","all");
        s1.addParam("allowscriptaccess","always");
        s1.addParam("flashvars",'&file=GJClip.flv&autostart=true');
        </script>
        </head>
        <body bgcolor="#CCCFFF">
        <img alt="GJ" src="GJPlay.jpg" width="80" height="60" onClick="s1.write('yt_the_video');">
        </body>
        </html>

    I have to have many small thumbnails on a page, and each one needs to open up to a full-size (640x480) video with controls when clicked. Having looked at Shadowbox (it dims the web page behind it, which I'm not allowed to do) and Lightbox (which I cannot get to work at all), I am down to a home-grown solution, which I prefer anyway.

    Read the article

  • Hudson... another Continuous Integration tool

    - by Narendra Tiwari
    In my previous posts I discussed CruiseControl.NET and its legacy support for .NET development. Hudson is yet another continuous integration tool. Hudson is also free like CCNet, and is built in Java.
    - CCNet has its legacy support for .NET applications, whereas Hudson can easily be configured for both environments (.NET and Java).
    - One of the major differences between CCNet and Hudson is Hudson's richer GUI, which provides interactive screens for project configuration, whereas in CCNet we have to play with a few XML configuration files.

    Both tools provide the basic features of continuous integration, e.g.:
    - Source control configuration
    - Code compilation/build
    - Ad hoc plugin tools configured along with compilation

    Support for ad hoc tools seems to be broader with CCNet; for example, almost every source control plugin is available for CCNet, whereas Hudson supports a more limited set of source control servers. An interesting point is that there are two major parts to the whole CI system: the part performed by the build tool, and the rest. The build tool takes care of all the ad hoc plugin tools, so it does not matter if the CI tool has no plugin for a given tool: if that tool provides command-line support, it can be configured in the build tool, and the build tool is in turn configured in the CI tool. For example, if I have a build script configured in MSBuild, CCNet can easily be switched to Hudson. Here we need not change anything in the build script; we just configure MSBuild on Hudson, pass it the path of the script file, and that's it... everything stays the same.

    Hudson resources:
    - https://hudson.dev.java.net/
    - http://wiki.hudson-ci.org/display/HUDSON/Meet+Hudson
    - http://wiki.hudson-ci.org/display/HUDSON/Plugins
    - http://callport.blogspot.com/2009/02/hudson-for-net-projects.html

    Java support on CCNet:
    http://confluence.public.thoughtworks.org/display/CC/Getting+Started+With+CruiseControl?focusedCommentId=19988484#comment-19988484

    Please share your thoughts...

    Read the article

  • Facebook Graph API - Image Uploading (as3/flash)

    - by lollertits
    I have been trying to get a bit more familiar with the Graph API for Facebook. It's very convenient, although the documentation is poor in some places. I'm having trouble uploading an image to an album. Anyone know how to do it? This is the code I'm currently working on :)

        private function uploadNewPic(albumId:String):void {
            var bmd1:BitmapData = new BitmapData(200, 200, false, 0x666666);
            var bm1:Bitmap = new Bitmap(bmd1);
            var jpgEncoder:JPGEncoder = new JPGEncoder();
            var ba:ByteArray = jpgEncoder.encode(bmd1);
            var data:URLVariables = new URLVariables();
            data.message = "Message";
            data.image = ba;
            data.photos = ba;
            data.url = ba;
            var method:String = URLRequestMethod.POST;
            var loader:URLLoader = facebook.call(albumId + "/photos", data, method);
            loader.addEventListener(FacebookOAuthGraphEvent.ERROR, onPicError);
            loader.addEventListener(FacebookOAuthGraphEvent.DATA, onPicUploaded);
        }

    I'm pretty much down to trial and error :) Any ideas?

    Read the article

  • Oracle SOA Suite for healthcare integration Dashboard By Nitesh Jain

    - by JuergenKress
    Oracle SOA Suite for healthcare integration came up with a new way of monitoring, where the user can configure a dashboard and follow dynamic runtime changes. The dashboards display information about the current health of the endpoints in a healthcare integration application. You can create and configure multiple dashboards as needed to monitor the status and volume metrics for the endpoints you have defined. The dashboards reflect changes that occur in the runtime repository, such as purging of runtime instance data, new messages processed, and new error messages. You can display data for various time periods, and you can manually refresh the data in real time or set the dashboard to refresh automatically at set intervals. A dashboard shows the following information:
    Status: The current status of the endpoint, such as Running, Idle, Disabled, or Errors.
    Messages Sent: The number of messages sent by the endpoint in the specified time period.
    Messages Received: The number of messages received by the endpoint in the specified time period.
    Errors: The number of messages with errors for the endpoint in the given time period.
    Last Sent: The date and time the last message was sent from the endpoint.
    Last Received: The date and time the last message was received by the endpoint.
    Last Error: The date and time of the last error for the endpoint.
    It also shows a detailed view of a specific endpoint: the document type, the number of messages received per second, the total number of messages processed in the specified time period, and the average size of each message. For more information please visit Nitesh Jain's blog.

    SOA & BPM Partner Community: For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Technorati Tags: SOA Suite, SOA healthcare, soa health, SOA Community, Oracle SOA, Oracle BPM, Community, OPN, Jürgen Kress

    Read the article

  • Flex/Flash 4 datagrid displays raw xml

    - by Setori
    Problem: Flex/Flash4 client (built with FlashBuilder4) displays the xml sent from the server exactly as is - the datagrid keeps the format of the xml. I need the datagrid to parse the input and place the data in the correct rows and columns of the datagrid. flow: click on a date in the tree and it makes a server request for batch information in xml form. Using a CallResponder I then update the datagrid's dataProvider. [code] <fx:Script> <![CDATA[ import mx.controls.Alert; [Bindable]public var selectedTreeNode:XML; public function taskTreeChanged(event:Event):void { selectedTreeNode=Tree(event.target).selectedItem as XML; var searchHubId:String = selectedTreeNode.@hub; var searchDate:String = selectedTreeNode.@lbl; if((searchHubId == "") || (searchDate == "")){ return; } findShipmentBatches(searchDate,searchHubId); } protected function findShipmentBatches(searchDate:String, searchHubId:String):void{ findShipmentBatchesResult.token = actWs.findShipmentBatches(searchDate, searchHubId); } protected function updateBatchDataGridDP():void{ task_list_dg.dataProvider = findShipmentBatchesResult.lastResult; } ]]> </fx:Script> <fx:Declarations> <actws:ActWs id="actWs" fault="Alert.show(event.fault.faultString + '\n' + event.fault.faultDetail)" showBusyCursor="true"/> <s:CallResponder id="findShipmentBatchesResult" result="updateBatchDataGridDP()"/> </fx:Declarations> <mx:AdvancedDataGrid id="task_list_dg" width="100%" height="95%" paddingLeft="0" paddingTop="0" paddingBottom="0"> <mx:columns> <mx:AdvancedDataGridColumn headerText="Receiving date" dataField="rd"/> <mx:AdvancedDataGridColumn headerText="Msg type" dataField="mt"/> <mx:AdvancedDataGridColumn headerText="SSD" dataField="ssd"/> <mx:AdvancedDataGridColumn headerText="Shipping site" dataField="sss"/> <mx:AdvancedDataGridColumn headerText="File name" dataField="fn"/> <mx:AdvancedDataGridColumn headerText="Batch number" dataField="bn"/> </mx:columns> </mx:AdvancedDataGrid> //xml example from server <batches> <batch> <rd>2010-04-23 16:31:00.0</rd> <mt>SC1REVISION01</mt> <ssd>2010-02-18 00:00:00.0</ssd> <sss>100000009</sss> <fn>Revision 1-DF-Ocean-SC1SUM-Quanta-PACT-EMEA-Scheduled Ship Date 20100218.csv</fn> <bn>10041</bn> </batch> <batches> [/code] and the xml is pretty much displayed exactly as is shown in the example above in the datagrid columns... I would appreciate your assistance.

    Read the article

  • E-Business Integration with SSO using AccessGate

    - by user774220
    Moving away from the legacy Oracle SSO, Oracle E-Business Suite (EBS) came up with EBS AccessGate as the way forward to provide Single Sign-On with Oracle Access Manager (OAM). As opposed to the AccessGate in OAM terminology, EBS AccessGate has no specific connection with OAM with respect to configuration. Instead, EBS AccessGate uses the header variables sent from the SSO system to create the native user session, like any other SSO-enabled web application.

    E-Business Suite integration with Oracle Access Manager: It is a known fact that E-Business Suite requires Oracle Internet Directory (OID) as the user repository to enable Single Sign-On, because E-Business Suite needs to be registered with OID for Single Sign-On. Additionally, E-Business Suite uses "orclguid" in OID to map the Single Sign-On user to the corresponding local user profile. During authentication, EBS AccessGate expects the SSO system to return the orclguid and the EBS username (stored as a user attribute in the SSO user store) in two header variables, USER_ORCLGUID and USER_NAME respectively. The following diagram depicts the authentication flow once the SSO system returns the EBS username and orclguid after successful authentication.

    Topic to brainstorm: EBS AccessGate as a generic SSO enablement solution for E-Business Suite. Even though EBS AccessGate is suggested as an integration approach between OAM and Oracle E-Business Suite, this section attempts to look at EBS AccessGate as a generic solution approach to provide SSO to Oracle E-Business Suite using any web SSO solution. From the above points, the only dependency on the SSO system is that it should be able to return the corresponding orclguid from the OID which is configured with the E-Business Suite. This can be achieved by a variety of approaches: by using the same OID referred to by E-Business Suite as the Single Sign-On user store; or, if the SSO system uses a different user store, either use DIP or OIM to synchronize orclguid from the E-Business Suite OID to the SSO user store, or use OVD to provide an LDAP view where orclguid from the E-Business Suite OID is part of the user entity in the user store referred to by the SSO system.

    Read the article

  • Publishing SWF using Adobe Flash

    - by Kim
    Hello everyone, I have a SWF file which consists of an image (1 keyframe), and it also contains an AS3 file with the following code:

        var loader:Loader = new Loader();
        var ur:URLRequest = new URLRequest("1.swf");
        loader.load(ur);
        addChild(loader);

    So basically, I am trying to play the SWF file (1.swf - an audio) while the image is being displayed. What I want to know is how I will be able to publish this project into a SWF file which can still play as expected even without the raw 1.swf file. I can publish the SWF right now, but when I delete the 1.swf file, my generated SWF can only display the image. Help me please. Thanks in advance :)

    Read the article

  • AS3 microphone recording/saving works, in-flash PCM playback double speed

    - by Lowgain
    I have a working mic recording script in AS3 which I have been able to successfully use to save .wav files to a server through AMF. These files playback fine in any audio player with no weird effects. For reference, here is what I am doing to capture the mic's ByteArray: (within a class called AudioRecorder) public function startRecording():void { _rawData = new ByteArray(); _microphone.addEventListener(SampleDataEvent.SAMPLE_DATA, _samplesCaptured, false, 0, true); } private function _samplesCaptured(e:SampleDataEvent):void { _rawData.writeBytes(e.data); } This works with no problems. After the recording is complete I can take the _rawData variable and run it through a WavWriter class, etc. However, if I run this same ByteArray as a sound using the following code which I adapted from the adobe cookbook: (within a class called WavPlayer) public function playSound(data:ByteArray):void { _wavData = data; _wavData.position = 0; _sound.addEventListener(SampleDataEvent.SAMPLE_DATA, _playSoundHandler); _channel = _sound.play(); _channel.addEventListener(Event.SOUND_COMPLETE, _onPlaybackComplete, false, 0, true); } private function _playSoundHandler(e:SampleDataEvent):void { if(_wavData.bytesAvailable <= 0) return; for(var i:int = 0; i < 8192; i++) { var sample:Number = 0; if(_wavData.bytesAvailable > 0) sample = _wavData.readFloat(); e.data.writeFloat(sample); } } The audio file plays at double speed! I checked recording bitrates and such and am pretty sure those are all correct, and I tried changing the buffer size and whatever other numbers I could think of. Could it be a mono vs stereo thing? Hope I was clear enough here, thanks!

    Read the article

  • Flash Double-click an externally loaded SWF

    - by Trist
    OK. I've got a class (which extends MovieClip) that loads in an external SWF (made in pdf2swf). That is added to another class which has declared doubleClickEnabled = true and I'm listening for DOUBLE_CLICK events. Problem is when the SWF is loaded my code picks up no DOUBLE_CLICK events, only CLICK events. I've tried it without adding the SWF to the stage and it does pick up DOUBLE_CLICK events. Anybody come across this before? class ParentClass{ ... public function ParentClass(){ ... mcToLoadSWF = new MovieClip(); addChild(mcToLoadSWF); doubleClickEnabled = true; addEventListener(MouseEvent.DOUBLE_CLICK, doubleClickHandler); ... } } I've also tried adding the event listener to the mcToLoadSWF as well. No dice. Cheers Tristian

    Read the article

  • Real time complex raster image morphing in Flash CS4

    - by cosmorocket
    Is there a way to load an image from some URL or a local folder and then apply some complex morphing to it in real time? For example, I have a vector-animated pseudo-3D paper in my project that is being flipped in different ways. I then want to place some image inside that paper box and morph that image according to the box's form changes, or at least make it look more realistic, not exactly the same as the paper box. Thanks.

    Read the article

  • Error when I try to install the desktop integration features for OpenOffice

    - by PENG TENG
    peng@peng-ThinkPad-SL410:~$ cd '/home/peng/Downloads/en-US/DEBS/desktop-integration' peng@peng-ThinkPad-SL410:~/Downloads/en-US/DEBS/desktop-integration$ sudo dpkg -i *.deb (Reading database ... 357248 files and directories currently installed.) Unpacking openoffice.org-debian-menus (from openoffice.org3.4-debian-menus_3.4-9593_all.deb) ... dpkg: error processing openoffice.org3.4-debian-menus_3.4-9593_all.deb (--install): trying to overwrite '/usr/bin/soffice', which is also in package libreoffice-common 1:3.6.2~rc2-0ubuntu3 /usr/bin/gtk-update-icon-cache gtk-update-icon-cache: Cache file created successfully. /usr/bin/gtk-update-icon-cache gtk-update-icon-cache: Cache file created successfully. Processing triggers for menu ... Processing triggers for hicolor-icon-theme ... Processing triggers for gnome-icon-theme ... Processing triggers for shared-mime-info ... Unknown media type in type 'all/all' Unknown media type in type 'all/allfiles' Unknown media type in type 'uri/mms' Unknown media type in type 'uri/mmst' Unknown media type in type 'uri/mmsu' Unknown media type in type 'uri/pnm' Unknown media type in type 'uri/rtspt' Unknown media type in type 'uri/rtspu' Processing triggers for bamfdaemon ... Rebuilding /usr/share/applications/bamf.index... Processing triggers for desktop-file-utils ... Processing triggers for gnome-menus ... Errors were encountered while processing: openoffice.org3.4-debian-menus_3.4-9593_all.deb Can anyone solve the problem?

    Read the article

  • Flash/AIR AS3: comparing contents of Screen.screens

    - by matt lohkamp
    In a sane world, this works as expected:

        var array:Array = ['a','b','c'];
        trace(array.indexOf(array[0])); // returns 0

    In an insane world, this happens:

        trace(Screen.screens.indexOf(Screen.screens[0])); // returns -1

    ... if Screen.screens is an Array of the available instances of Screen, why can't that array give an accurate indexOf one of its own children?

    edit - apparently to get back to the world of the sane, all you have to do is this:

        var array:Array = Screen.screens;
        trace(array.indexOf(array[0])); // returns 0

    Anyone know why?

    Read the article

  • Flash AS3 - XMLList - List items with number of times they occur

    - by Dale
    I have an XMLList 'Keywords', which consists of about 30 XML items. I want to count the number of unique keywords in the list and how often they occur, then display the top 3 most frequently occurring keywords. There's probably a simple sorting/counting function to carry this out; however, I'm quite new to AS3, so please forgive my naivety. Cheers.

    Read the article

  • Flash CS4/AS3 Dynamic Text Box

    - by Jono
    Hi, I have created an XML image gallery which displays text in between each slide. I have created a movie clip with a dynamic text field (with Render HTML selected) to display the text from the XML, which is pushed into an array. Now, this all works great BUT... /n or /r is not creating a new line break (as they need to be custom). Yet if I create an Array and manually push strings like "Bla bla bla /n bla bla bla" I get a line break. I have tried converting the Array item to a string (even though it already is one), and I would also like to avoid creating textField = new TextField(). Any ideas would be welcome. Cheers

    Read the article

  • Flex/Flash 4 datagrid literally displays XML

    - by Setori
    Problem: The Flex/Flash4 client (built with FlashBuilder4) displays the XML sent from the server exactly as is - the datagrid keeps the format of the XML. I need the datagrid to parse the input and place the data in the correct rows and columns of the datagrid.

    Flow: click on a date in the tree and it makes a server request for batch information in XML form. Using a CallResponder I then update the datagrid's dataProvider.

    [code]
    <fx:Script>
        <![CDATA[
            import mx.controls.Alert;

            [Bindable] public var selectedTreeNode:XML;

            public function taskTreeChanged(event:Event):void {
                selectedTreeNode = Tree(event.target).selectedItem as XML;
                var searchHubId:String = selectedTreeNode.@hub;
                var searchDate:String = selectedTreeNode.@lbl;
                if((searchHubId == "") || (searchDate == "")){
                    return;
                }
                findShipmentBatches(searchDate, searchHubId);
            }

            protected function findShipmentBatches(searchDate:String, searchHubId:String):void{
                findShipmentBatchesResult.token = actWs.findShipmentBatches(searchDate, searchHubId);
            }

            protected function updateBatchDataGridDP():void{
                task_list_dg.dataProvider = findShipmentBatchesResult.lastResult;
            }
        ]]>
    </fx:Script>
    <fx:Declarations>
        <actws:ActWs id="actWs" fault="Alert.show(event.fault.faultString + '\n' + event.fault.faultDetail)" showBusyCursor="true"/>
        <s:CallResponder id="findShipmentBatchesResult" result="updateBatchDataGridDP()"/>
    </fx:Declarations>
    <mx:AdvancedDataGrid id="task_list_dg" width="100%" height="95%" paddingLeft="0" paddingTop="0" paddingBottom="0">
        <mx:columns>
            <mx:AdvancedDataGridColumn headerText="Receiving date" dataField="rd"/>
            <mx:AdvancedDataGridColumn headerText="Msg type" dataField="mt"/>
            <mx:AdvancedDataGridColumn headerText="SSD" dataField="ssd"/>
            <mx:AdvancedDataGridColumn headerText="Shipping site" dataField="sss"/>
            <mx:AdvancedDataGridColumn headerText="File name" dataField="fn"/>
            <mx:AdvancedDataGridColumn headerText="Batch number" dataField="bn"/>
        </mx:columns>
    </mx:AdvancedDataGrid>
    [/code]

    I cannot upload a pic, but this is the XML:

    [code]
    <batches>
      <batch>
        <rd>2010-04-23 16:35:51.0</rd>
        <mt>PRESHIP</mt>
        <ssd>2010-02-15 00:00:00.0</ssd>
        <sss>100000009</sss>
        <fn>DF-Ocean-PRESHIPSUM-Quanta-PACT-EMEA-Scheduled Ship Date 20100215.csv</fn>
        <bn>10053</bn>
      </batch>
    </batches>
    [/code]

    and the XML is pretty much displayed exactly as is in the datagrid columns... I would appreciate your assistance.

    Read the article

  • Flash As3 Mute Button problems

    - by Lee
    Hey guys, I am trying to create a UI movie clip that can be used across different scenes. It uses variables from the root scope to determine states. When I press the mute button it works fine; however, when I try to un-mute, things go weird. Sometimes it takes 2 clicks to unmute, sometimes more. It seems random. Muting, however, seems to work first time. Any ideas?

    Main timeline:

        var mute:Boolean = false;
        var playerName = "Fred";

        function setMute(vol) {
            var sTransform:SoundTransform = new SoundTransform(1, 0);
            sTransform.volume = vol;
            SoundMixer.soundTransform = sTransform;
        }

        function toggleMuteBtn(event:Event) {
            if (mute) {
                // Sound On, Mute Off
                mute = false;
                setMute(1);
                ui_mc.muteCross_mc.visible = false;
            } else {
                // Sound Off, Mute On
                mute = true;
                setMute(0);
                ui_mc.muteCross_mc.visible = true;
            }
        }

    ui_mc ActionScript:

        if (MovieClip(parent).mute == false) {
            muteCross_mc.visible = false;
        }
        mute_btn.addEventListener(MouseEvent.CLICK, MovieClip(parent).toggleMuteBtn);

    Read the article

  • Can Flash AS3 listbox data contain variable info?

    - by Anne
    I'm populating a listbox like this:

        dp.addItem( {label:"red dress", data:"OV4MP/23OL.swf"} );

    Instead of data:"OV4MP/23OL.swf", I would like to make part of the data file name a variable taken from a dynamic textbox named centerPt that belongs to the parent movie clip, so I did this:

        dp.addItem( {label:"red dress", data:"OV4MP/23" + MovieClip(parent.parent).centerPt.text + ".swf"} );

    When I trace the selectedItem.data using:

        trace("you have selected: " + overlays.selectedItem.data);
        trace(MovieClip(parent.parent).centerPt.text);

    I'm getting: "you have selected: OV4MP/23.swf". What I should get is OV4MP/23OL.swf. It is not picking up what is in the dynamic centerPt.text field (the letters OL), even though that text field is tracing correctly. Is it possible that data cannot hold a variable? Thank you in advance for any help. Anne

    Read the article

  • Flash - Ease out scrolling moviclip, scrolled by sensor on each side

    - by webfac
    I am hoping someone can shed some light on this issue for me... I have a movieclip that is scrolled by means of a 'sensor' on each side of the stage. The clip scrolls fine in both directions, however here is my problem: When the users mouse leaves the stage, the movie clip stops dead in it's tracks, and this does not provide a noce smooth effect. Looking at the code below, is there any way I could tell the animation to ease out when the users mouse leaves the stage rather than simply stop suddenly? Much appreciated! class Sensor extends MovieClip { public function Sensor() { } public function onEnterFrame() { // var active:Boolean; // is mouse within sensor? if ((_level2._xmouse >= this._x) && _level2._xmouse <= (this._x + this._width)) { active = true; } else { active = false; } if(active) { // which area of the sensor is it in? var relative_position:Number; var relative_difference:Number; relative_position = _level2._xmouse / this._width * 100.0; //_level2._logger.message("relative position is " + relative_position); if (!isNaN(relative_position)) { // depending on which area it is in, tend towards a particular adjustment if(relative_position > _level2._background_right_threshold) { relative_difference = Math.abs(relative_position - _level2._background_right_threshold); //relative_difference = 10; } else if(relative_position < _level2._background_left_threshold) { relative_difference = Math.abs(_level2._background_left_threshold - relative_position); relative_difference *= -1; } else { _level2._background_ideal_adjustment = 0; } var direction:Number; if(_level2._pan_direction == "left") { direction = 1; } else { direction = -1; } _level2._background_ideal_adjustment = direction * (relative_difference / 10) * _level2._pan_augmentation; //_level2._logger.message("ideal adjustment is " + _level2._background_ideal_adjustment); if (!isNaN(_level2._background_ideal_adjustment)) { // what is the difference between the ideal adjustment and the current adjustment? // add a portion of the difference to the adjustment: // this has the effect that the adjustment "tends towards" the ideal var difference:Number; difference = _level2._background_ideal_adjustment - _level2._background_adjustment; _level2._background_adjustment += (difference / _level2._background_tension); // calculate what the new background _x position would be var background_x:Number; var projected_x:Number; background_x = _level2["_background"]._x; projected_x = background_x += _level2._background_adjustment; //_level2._logger.message("projected _x is " + projected_x); // if the _x position is valid, change it if(projected_x > 0) projected_x = 0; if(projected_x < -(_level2["_background"]._width - Stage.width)) projected_x = -(_level2["_background"]._width - Stage.width); _level2["_background"]._x = projected_x; // i recommend that you move your other movieclips with code here } } } } }

    Read the article
