CounslerCoffee
Registered Senior Member
Tiassa, I just got certified for .NET
It's a basic difference of philosophy. For instance, why is there a 56k modem on an iMac? Because no matter how badly Apple would like everyone to have broadband, they aren't powerful enough to make it so. Using a Mac without a network is a little like using a gun without bullets.

Well, no network, no help. If you want to transfer something between computers that aren't on a network, say ... at another house, then you're out of luck?
Despite the amount of cooperation between Netscape/AOL and Apple, I cannot see why anyone would want AOL to be their primary internet access. This is, admittedly, a personal opinion, but in terms of "Where do you want to go today?" are you going to answer that question yourself or let AOL answer it for you?

No internet and you need a disk. What if you don't want to switch on AOL (plausible for many Mac users)?
It was part of their expectation, but FireWire, Rendezvous, and the seemingly conspicuous lack of a floppy drive all testify to the Mac's optimization for a network. There are also CD-Rs, CD-RWs, and other technologies available. Hell, my 100 GB hard drive is designed to travel. Furthermore, why waste a floppy drive on, say, notebook computers?

I'll tell you that Mac didn't expect to use networks instead of floppy disks. They EXPECTED zip disks to REPLACE the floppy disk. THAT is the reason. Zip disks are too expensive to be good for conventional use like that.
Tell it to the software designers. You know, people like Bill Gates, who, despite the advances in processor speed, don't provide a much faster user experience because the software is designed to be larger instead. You might ask any software manufacturer who sells a product before it's ready why it needs updating. Among Apple updates are frequently "Carbon Library" updates, which allow software manufacturers who don't want to program Apple-native applications to port their products to the platform. One of the reasons we update is to accommodate the software manufacturers.

I doubt double-clicking on a shortcut is a user error. A computer should not crash because it thinks it needs upgrading. An old computer should work just as well now as it did before.
Ask the software manufacturers. I don't know, ask Microsoft. It's not like they haven't had to patch a plethora of security failures in their operating systems.

That is a terrible excuse. Why does something need to be upgraded?
That is a terrible excuse. Why does something need to be upgraded?

Ask Adobe, or Microsoft, or even Omni Group--a rising OSX producer--why you need to update.
Incidentally, you seem to have confused the terms "update" and "upgrade". Since Microsoft needs to put out a new version of their operating system to handle broad updates, I can understand. But that's the nifty thing about the Carbon update for Apple users. If something is innovated, the OS can be updated to handle a brand-spanking-new concept. Windows users should try having ideas from time to time.

How profound. The idea sucks in that case.
Apples are not designed merely to operate as tools. Apples are also designed to help inspire their users. It is much easier to get work done if you're sitting in front of an OS and a computer that is harmonious. That is, what to do today? Record an album? Edit a short film? Oh, shit ... I'm a Windows user, so I'll spend the day downloading drivers.
Microsoft is about financial efficiency--namely theirs. Apple is about working efficiency--namely the users'.

What's the use of being a petulant, presumptuous child? It's the fifteen years of using Microsoft products after learning to operate multiple platforms that finally compelled me to get an iMac. It's a decision I have not regretted.

Have you ever actually used PC systems for more than 5 minutes at a time? I kinda doubt that trying to fix someone's system with 0 knowledge about it is going to prove anything.

That's why I remember OS's like "WinCE", and why it was aptly named. It's more about politics and money with Microsoft. Like Gates wanting MIE to update on Macs not as an application update, but as part of the system update, in order to foster his "browser-as-part-of-operating-system" pitch. Sadly, MIE does not integrate well with the operating system, generally ignores it, and because its X version is a mere port, it still suffers memory-management problems, because it is not designed to allow the OS to handle memory management the way the OS is designed to. Maybe if Gates was more interested in delivering than yammering, he wouldn't be viewed as such a joke.

Windows is made TO be customizable, to fit with the user's needs.

Lesson time: I'll admit that their organization for customizing sucks terrible ass, but I looked at OSX and it's very much a copy of Windows organization.
• The 1980s and 1990s GUIs were all ripped off from Xerox
• OSX has existed in various developmental forms for at least 10 years prior to its launch; and here's the thing--most or all of those forms were functional.
• Windows XP is a visual rip-off of OSX.
• Microsoft generally follows where Apple leads.
It's silly assertions like OSX being a copy of Windows organization that remind me so clearly of the dangers of the "Gospel of Bill".
Keep dreaming. I saw OSX function flawlessly, literally with no restarts except for product updates, in beta form, for a whole freaking year before anyone outside Microsoft heard the phrase "XP". We all laughed when we heard about XP; it still seems to be "Xtra Porky". Macs are designed to operate at their best coming out of the box. Unlike Windows machines, you don't have to spend time begging the thing to deliver.

Mac is made to be untouched; messing with things degrades it. That's so dumb. The "best" is an opinion; what's best for one is worst for another.

It's very simple. I intend to go through the affected applications and merely check their preferences. I'm willing to guarantee the problem is in there. Now, how easy that process is will tell me much about whether or not Microsoft has learned anything while following Apple around, or if they're still just in it for the money.

How do you have any way to compare the magnitude of the problem to the Mac systems? Maybe you're stupid enough to write that situation up as a software error. If she hit the computer with a sledgehammer, saying you can't fix it doesn't prove the system is bad, now does it?
I'm not.
But of course, I'm an Apple user, and that may have something to do with why I can tell the difference.
A curious thing: With three professionals and over a decade of my own use of PC's to be accounted for, I once tried building a system. It didn't work. Seemed like something I was doing. My Windows friends, including the professionals (and one whose job it is to actually assemble computers) told me it was something I was doing wrong. So I said, "Fine, let's go out and get what we need. You're all here and you tell me I'm stupid, so let's go get this stuff with your advice and put it together."
I have never heard "manufacturer error" so many times in any process. Turns out it wasn't me; the parts these professionals advised me to buy did not work well together as they had speculated. By the time I was done, my preference for Windows was shot, my patience with non-standardized hardware was shot, and I'd spent enough money to buy three iMacs. If it had just been me, I would have believed it to be my own problem. But when three professionals cannot select gear, get it home, and get it put together without setting something on fire or sitting for hours saying, "It has to work. It's supposed to work!" well?
That's the thing about the platforms. If something's wrong with Windows, it must be the user. If something's wrong with the system, it must be the user. Turns out, whatever's wrong is also wrong with the professional users, so that seems to be a problem. Barring the statistically-necessary hardware failure (of which I suffered one), Macs simply don't have this or any similar problem. I accept the statistical necessity of hardware failure, but it shouldn't be so hard that well-paid professionals can't even do their jobs.
It was insane. I wasted so much money on a PC that I could have bought three Macs.
Incidentally, I just got a new digital camera for pictures of my coming child. I noticed something about it though. It doesn't have a base. That is, I once bought my brother a digital camera, thinking, "Just plug it in." My mistake. You can't do that on a PC. But he's not your usual Windows idiot who can't figure out to download a driver or update a piece of software. After a few hours he could make the camera work very well on the PC. My Mac? It took ... four seconds. But since my camera also plugs in with a USB cable that easily, I'm curious about these people that have to put their cameras into a plastic base to effect data transfer. What, pray tell, is the purpose? Is it just that Windows-related idea of proprietary rights, so that no piece of hardware can work with any other without an extra piece of hardware acting as an intermediary? Maybe it is a Mac thing, too, but I have yet to see a Mac user with a bunch of extraneous hardware littering their work environment. There's just something about a computer that works the first time.
Ironic moment: My associate arrived, and I asked what the problems were before we headed out to see the computer in question: "I tried to burn a CD, and it didn't work. I don't know what I did wrong."
I'm sorry, but this is foreign to me. I just tell my computer to burn a CD, and it burns. What's funny is that this isn't what I was expecting to deal with. Should be fun.
thanx,
Tiassa
PS ... Sorry about the length. I know it's tough. Do your best.
Well, I still point out the difference between update and upgrade. But you're now welcome to ask that question of Microsoft.

That is a terrible excuse. Why does something need to be upgraded?
I put these two quotes together so I could ask one facetious question for effect:

• I have never had any problem with drivers for anything ever except for a printer that I didn't have the installation software for
• The thing didn't come with installation software? Strange.
I've come to prefer USB, but we have cable. I like the bandwidth of cable, but AT&T can't provide reliable service. Oh, well.

The company went out of business and was bought by another company; needless to say, the problem is over and I have gotten 100 kb/sec since then.
If it's reasonable, get a patch cable. If not, use the internet. PC users do not have the excuse of a "stand-alone" system, since after all, your Internet Explorer is "part of the operating system". A Windows box is supposed to be on a network, or else Bill Gates lied under oath. While I'm certain the latter is the case on a number of things, this one's so critical to Microsoft's position that they have to deliver: PCs are designed for networking.

I happen to use floppy disks a lot. I remember wanting to transfer something onto or off of that iMac and couldn't because it didn't have a floppy drive, standard everywhere except on a Mac.
It's pretty hard with a Mac. You have to do something like ... well, here, let me take a look for a minute at my system.

No, I didn't touch it, but I fail to see how changing a few settings could make the computer freeze...
Opinion, I'll grant. But product quality sometimes demands aesthetic sacrifices. It's one of the things Apple has worked to fix.

That is way too much opinion to argue against. Old Macs were the ugliest computers I have ever seen.
Which is funny, because Mac users all laughed when we saw XP. Whenever I sit down in front of XP for anything, all I can think of is a line from Ren & Stimpy's "Space Madness"--the shiny, candy-light button!

The new ones are filled with eye-candy that I don't appreciate as much as others might.
Yes, seeing that much magenta and pastel-yellow on anyone's desktop ought to be illegal. Customize the visuals of the OS? Yeah, I know. What's funny is that in ... a decade (at least) of using Windows, I never found a visual scheme I liked. Turns out it wasn't the colors, it was the whole appearance.

Windows systems have long had the ability to customize the visuals of the OS.
Well, as you pointed out, the old system was black & white. But it seems to me that in OS 9 I could change the appearance of my desktop. What I couldn't do, of course, was make my application title bars display in 72-point chartreuse characters. I always wondered about that with Windows; it seemed a waste of memory.

Only new Macs have comparable visual customizability.
Do you remember Windows 3.x?

I can agree, or at least I won't contest, that Windows stole original ideas from Apple, but I would like to respectfully disagree with the above quote. I have seen that Windows had far surpassed Mac in the area of visual stimulation until recently.
That's part of the point. This is a new operating system. Just as we got nearly 20 years out of the old OS, the new one is designed to grow and change to meet our needs. Like I said, Apple isn't just a product, it's an idea.

Windows looks much more like its older versions than Mac's OSX looks like its predecessors.
That's a shallow version of competition. Apple wants to give you a working product. Microsoft just wants your money.

That's the way it is with competition: each side wants to better the other, and in many cases this makes both products more and more similar, with each side stealing the other's good points.
I don't see the issue. Give me an example.

As I have said, making the computer customized to the way you like it should in no way impede the computer's usefulness.
Well, it does depend on your needs. But Apple aims to meet them. I don't see the issue, really. I see it in theory, but it's not a question with a Mac.

The nature of computers is that there is no "best"; each system works well for some things and not as well for others.
Now that is weird. My CD-burner is by AcomData. It was a gift. While I appreciate it, I find it quite interesting that none of the Mac OS software worked. A matter of updates, of course, but they chose not to update their software. Oh, well. It works anyway. I just find it hilarious that some of the software listed on the box, in the documentation, and on the CD-face itself was not included. Of course, the person who gave it to me couldn't be bothered to go to an Apple store. Despite the number of drivers he has to download despite product software, he still thinks that everything should work the first time.

For some reason, no one has heard of software that can delete (no, not overwrite) a whole CD in like 2 minutes... It came with mine.
So does Apple. So do I.

I do wonder why every piece of equipment needs software.
That's part of what Apple is trying to fix. But what's funny is that people complain that Apples aren't hard enough to use.

Since computers are made by programmers, they make them easy for them to make, easy for them to use, and hard for everyone else.
Yes, three letters. DSL ... duh.

You realize "USB" is not a form of internet...
• Burn a CD-RW if you must. But if you insist on using a Mac in a manner it's not designed for (e.g. stand-alone) you have to be prepared to do something like get a zip disk. But if I really need to transfer something to a computer I can't access by network, I can use a CD-RW.

If I, say, wanted to take something on a disk, and I did not know what computer I would use, then what is the Mac solution? If I want to take it to a computer at my school, what do I do (let's assume they have Macs)? If a computer doesn't have internet and you don't want to string a mile-long cable between them, what then?
I would kindly propose that you explain that to Bill Gates, who has testified before Congress that computers are designed to be networked, in order to foster his "browser is the operating system" argument and excuse himself for treating consumers and businesses poorly. Win98, at least, but if you use Win2k or WinXP, your OS is designed for networking.

PCs are designed for number crunching. That is why everything else may seem a bit disorganized, because everything is based around number crunching. Any networking stuff is an add-on that is effectively not incorporated into the design.
Tell me about it ... my OS is dressed-up, user-friendly Unix.

Computers work off of 20-year-old ideas, Mac or PC.
Come on. Weren't we just, a couple posts ago, talking about preference settings? Jesus H. Baldheaded Freaking Christ ... I don't know what to say if you've forgotten that already.

Well, any input to the computer can do ... anything. If your definition of a user is "one who inputs the information", then yes, the user is at fault.
That was largely a Windows problem for the users. If it's a problem for Macs, we avoid it by ... being on a network so we can update our systems.

Yet there are literally thousands of ways a computer modifies itself, and this is one of the older OSs' problems; it is why problems compound over time.
And the rest of the market followed. Operating system shells, computer cases, even the George Foreman Grill, for criminy sakes!

I remember seeing iMacs for the first time. What does the blue plasticky stuff remind you of? They tried to be WAY too artistic, what with the round mice and pretty colors.
This answers a lot of questions for me, then. Microsoft sort of started the "GUI Wars". After everybody stole their ideas from Xerox, a number of computers had desktops that looked like the Mac's--the Atari STs and the Commodore Amiga, as I recall. But Microsoft wanted to be different. The horrifying culmination of that effort was called Windows 3, which, while it had its merits for being solid, was not particularly intuitive or otherwise friendly. All because they wanted to be different. That's why Apple needled Microsoft over Win95. Microsoft was only ten years late getting to the appearance, and they never did get it entirely right.

"Do you remember Windows 3.x?"
Never seen it.
Um ...

That's the way I view Macs... You seem to contradict yourself: you said that Windows offered only sickening color schemes, yet now you cite candy-lights and impressive shiny bells and whistles?
Well, be sure to thank them the next time you buy a music CD or watch a DVD; there's a reason the people other people want to be--singers, writers, directors--use Macs. Because they work. People talk about Apple's small market share. But they don't realize how important that market share is to their lives.

The simple people are the ones using Macs.
Well ... is your computer just for "using"? What I'm after here is that people have computers for various reasons. My computer is not just a tool, it is a part of my life; a Y-connector hooks it to my stereo so I can hear any of the seven days' worth of music I have on a hard drive. It is part of my communication system. It is part of my working life. It is part of my upcoming parental life. It is a tool, a resource, and can, should I ever decide to learn how, be used as a weapon. It won't wash my dishes, it won't blow the air around like a ceiling fan, but I can live with that. Besides, give Steve Jobs a couple more years and the damn thing will wash my dishes and run my environment. Now, I'm not talking about the customized way that can be done right now. I'm talking about it as an expected part of my software suite, so that you can take your Mac out of the box and have it run your HVAC. It's coming. We're patient. Like with .Mac; we know it's not a straight answer to .NET, but at least ours is in place, ours works, and by the time .NET can be fully implemented, it will be taking its cues from Apple, just like XP did, just like Win95-2k did. What is your computer for? Seriously, if you just want a boss game deck, fine ... get the $400 Gateway. It's the smarter choice. If you want just an e-mail computer, get the cheap ePC. Need something for the kids to write a report on? Go on, dude, get a Dell. But I still think of that dumb slogan: "Where do you want to go today?" Microsoft has never delivered on that.

I think of user-friendliness as easy to use. Mac people think of it as easy to learn. I'd rather spend 20 minutes learning how to use my system than have a system that takes 1 minute to learn and an excess amount of time to actually carry out what you want to do.
You'll have to give me some examples. Really. I haven't a clue what you're talking about because I cannot honorably presume that your thinking would be as wrongly oriented as it seems. Also, I would prefer examples because if I presume what you're talking about, and you feel that is somehow beneath your intellectual level ... so help me out a little here, because there's a couple of places I can think of where you'd be flat wrong.

Windows has all sorts of organizing shortcuts and alternate ways of working; I haven't seen any such pattern in Mac systems.
Well, more and more game developers are coming over to use Apples. Oh, wait ... I forgot--the G4 is a supercomputer. Sorry ... but that's such old news to us Apple users that I completely forgot about it.

What do you think creates all those neat little office work representations on your screen or those flash game icons? Work and games require a ton of number crunching; supercomputers are made for the number-crunching power of maybe 1000 normal computers.
Well, duh! All computers crunch numbers, and do nothing else. But PC system development has been driven by games and office applications for 15 years. Last year the computer gaming industry pulled in billions of dollars. Games and office applications are driving PC development. Specific jobs are being done by those CPUs. They are not designed for high-powered number crunching. This is why the designs are going modular, with each component having its own purpose-built processor and memory. For dedicated number crunching, you use supercomputers.

Originally posted by Frencheneesz
What do you think creates all those neat little office work representations on your screen or those flash game icons? Work and games require a ton of number crunching; supercomputers are made for the number-crunching power of maybe 1000 normal computers.
Originally posted by tiassa
My computer is not just a tool, it is a part of my life; a Y-connector hooks it to my stereo so I can hear any of the seven days' worth of music I have on a hard drive. It is part of my communication system. It is part of my working life. It is part of my upcoming parental life. It is a tool, a resource, and can, should I ever decide to learn how, be used as a weapon. It won't wash my dishes, it won't blow the air around like a ceiling fan, but I can live with that. Besides, give Steve Jobs a couple more years and the damn thing will wash my dishes and run my environment.
thanx,
Tiassa
Perhaps what is needed here is an exploration of "religious methods". As I see it, the primary religious method is revelation, and perhaps we can work in symbolism as well. What would you define as religious method?

Originally posted by Tiassa
We see on an intellectual level that the precepts of Christianity can have a vast effect on how people view the world and consequently what equals proper conduct. But in the terms of poor law and welfare--Spartanism is more practical, but is it preferable?
But the reasoning was there, and it was illuminated by some of the earlier thinkers, such as Aristotle. It just kept getting overridden by religious interpretation.

Am I, for instance, the first person to passionately believe that the protection of the weak actually helps the species? Perhaps, but I don't think so. On the other hand, why didn't anyone say it this way hundreds of years ago? Because there was never any reason, aside from God's will, to become charitable or philanthropic.
Objectivity has its limits. At some point there must be a transition between objectivity and subjectivity. I find our tendency to see dualism as a binary set to be partially to blame here. Objectivity/subjectivity is seen as an either/or proposition. But there does need to be a congruency between the two. At some point this may simply be a matter of focus, which is fine. However, I do think that you have a tendency to see objectivity as mechanistic... which, IMO, is a paradigm that quickly falls apart when forced upon humans.

However, I've been rocking the boat today comparing the atheistic objectivity toward God and the notable lack of such objectivity elsewhere in life.
I do find this problematic, on both sides. To wit, the Christian struggle to come to terms with non-Christians who live good lives and/or do not know about Christianity. The loopholes invented to justify the obvious contradictions would put a lawyer to shame. Yet, because of the paradigm, they cannot see the issue for what it is.

As long as we continue to separate religion from other ideas that have similar divisive results, we will continue to treat a common idea as foreign to itself, and never really get to know it.
But then it's not literal... it's interpretative.

...the same phrase has two different values to people. Even if we take this literally.
And therein lies the difference. The Rorschach test acknowledges that it's interpretative. There is no single correct response. Only an average, a trend or tendency, and a relationship between the answers and the mind giving the answers. I see this as a problem in the realm of science as well... only rarely is the fact that we are dealing with models brought up for discussion. It is the relationship being explored that is the truth of the matter, not the specifics of the model.

Now the wholly internal literalism will still assert a "virgin birth". But what it won't assert is that the "virgin birth" is a fictional, psychosocial metaphor offered to signify some deep Freudian conscience. It's kind of like looking at a Rorschach. What we see are ink blots. What they look like is a different thing altogether.
You wrote, "What I don't think you addressed properly is the flexibility of religion. A simple historic observance demonstrates radical shifts in Christian principles." ... And yet now we find you leaning away from that flexibility: I see a strong tendency towards perpetuation of such bias.
...the reconciliation of these two ideas is not as simple as it would seem they should be. I submit that I have properly addressed the flexibility of religion, and that what I am telling you does to a certain degree reconcile the "radical shifts" with the inflexibility that tends "toward perpetuation of bias".
You're dealing with at least two sets of ideas: the actual dogmatic foundation itself, and what people think of it.
Thanks for pointing out my contradiction... I'm not being clear enough. But I do find the apparent contradiction to be resoluble. Flexibility is apparent from a societal/historical viewpoint where we see the shift in principles over time. Meanwhile, taken at any single point in time, we find a particular set of declared absolutes. Thus, examining any particular set of interpretations, we find them generally to be static until reality or societal change stretches them to the breaking point and they are forced to adapt or shatter. It is the individual adherence to belief and the institutional nature of our various educational systems that are largely to blame. We are still locked into a Newtonian paradigm where facts are supposedly immutable, where reality is thought to be concrete and inflexible. Frameworks built upon such foundations are all too liable to shatter upon change.
I find that I must maintain a flexible position... that my reference must be relational rather than fixed.
What is amazing is how quickly, when a fact regarding a changing system is brought up, the issue is avoided. Bring up the Crusades, the Inquisition, witch trials, cross burning, and the modern Christian answer almost always refers to a "false" interpretation. Yet I find no basis upon which a "true" interpretation may be founded. It is always interpretative.
Heh... if I knew you'd be this happy about it I could have stated it at the beginning.

You've got it.
Exactly!
Ring the bells, roll out the trolleys, call Icarus Montgolfier Wright and Christus Apollo! Go tell it on the mountain!
(I really am this excited ....)
And now you know approximately what "core" the Sufis are seeking, and thus why the idea of accretions is so important to me.
In this sense of God, God and knowledge are both. Zen philosophy has called it Beginner's Mind ("In the beginner's mind there are many possibilities; in the expert's there are few.")

Many people take their faith in God as a starting point
...
Some religious people are smart enough to take God as a future goal.
...
In a similar sense, knowledge is not a starting point. It is for further explorations, but analogously knowledge cannot be presumed, it must be attained.
God is that which is greater than anything we can imagine.
Nobody can tell me what God is. The whole of what the term equals to people exceeds the physical capabilities of my individual brain. The whole of what it can represent exceeds the capabilities of my entire lifetime. "God is" works well as a phrase because it attempts to describe nothing else. It is, of course, not entirely accurate, since God also represents those things which Are Not. But functionally, "God is" stands as a statement I simply cannot deny. Combined with an acceptable definition of God, it is dishonest for me to maintain my former atheism.
That exists which is greater than can be imagined. Defined as such I don't know that it can be denied. Which I see as somewhat problematic in itself.
Definitely. Or the manipulation of terms in such a way that it forces a conclusion.

I'm well and fine with that. I think there's a difference between a beneficial paring of factors and the narrowed-for-sound-bite routine.
While I agree with the potential due to the lack of limits, I think that most of the current measurable difference is symptomatic, and I feel that it will grow smaller as atheism becomes more predominant. To wit, most of the world's population is religious and most societies have a strongly influential religious aspect. Therefore atheists tend to be the type of people who "think outside the box", who challenge the traditionally accepted beliefs... these people tend to be more intelligent.

Atheism, for instance, has a higher intellectual potential than something that dogmatically limits its field of inquiry. We have, in the past, argued that atheists are smarter. I actually tend to believe this, but that's my own problem.
Guns are designed to kill things. When they kill the wrong thing or used for the wrong purposes, they're problematic.
Likewise labels. Humans classify and enumerate in order to identify and establish a relationship with a thing in the Universe. To apply classifications wrongly is problematic.
But it's not the gun or the label that's problematic... it's the use of them. Lenny Bruce had an infamous routine that I'm just itching to post, but I'm fairly sure that it would be removed, despite the fact that his intent is the exact opposite of the reason it would be removed. Here is a link: http://www.wam.umd.edu/~molouns/amst450/delillo/1950s.html
I agree and find it disingenuous. At some point the atheist is simply begging the question. As above, it's a manipulation of terms set in order to force a conclusion. But there is also a strong contingent of Christians here. Addressing them within their own paradigm is proper in this respect.

Much of the "narrowing" going on is an effort to avoid discussion, and seems designed to haul down the basis for communication.
But I don't find that to be necessary. I have not experienced anything like it. If anything, I find myself more capable of adopting various perspectives for various reasons. I think the problem lies in trying to apply a single point of reason across the board, or a single method across paradigms.

I've said a lot of things about atheism in my time, but its biggest betrayal, the reason for my outright distaste, is that it further narrowed my perspective of the world.
Here's the thing: people aren't refining. Have you noticed that philosophies are getting more simplistic, broader in their vision, and generally weaker and more fanciful? Have you noticed that educational standards are slipping, that the truly bright performers are becoming more rare in important arenas? I'm happy that people are taking a more artistic direction, but a nation of filmmakers cannot grow wheat.

I agree with the statement that people are not refining; however, I disagree that they're taking a more artistic direction. In short, I find that refinement is the essence of art, and what I see as problematic is that we're moving away from it. We're more concerned with easier and faster rather than better, particularly in the Western experience. I mean, I'm all for getting my fast food quickly, but when I find myself thrilled that my fries are actually hot when I get them, I have to wonder at the standards we're setting for ourselves.
Similarly, I'm appalled and enraged at the influence of the psychiatric community. The norm has become the apex of our culture. We vigorously beat down any expression outside the range of normalcy. I mean, come on, ADD in pre-teen children? As opposed to what? My very definition of a child includes attention deficit. Any child that doesn't flit from subject to subject is either idiotic or drugged. Part of growing up includes learning the skills of concentration and focus... of pursuing refinement and excellence. Calmness in chemical form is none of these things.
When the most part of what people think about is survival, is keeping up, then we're not aiming for progress.

But we're not focused upon survival here in the US; we're focused upon immediate gratification. Quantity, not quality, is our standard. Personally, I think most people would be better off living under the constraints of subsistence farming. They'd have less time to fuck things up and fewer imaginary problems if they were dealing with real ones.
It all feels and smells and tastes like the same brand of idiocy, everywhere I turn.

Amen!
And here at Sciforums we have a chance to do whatever we want, but included in that whatever is the chance to figure out the things that confuse us. That's why I find it so sad that Sciforums is in style and outcome very much a microcosmic representation of reality. We have a chance to escape that microcosm and forge new territory, and hopefully spill that spirit back into our daily lives.

But Tiassa, it's too much a product of our culture to be anything else. I've learned to appreciate those occasions where we rise above the common babble, but most of the time that is beyond a public forum such as this... there is a leveling effect at work.
There is a presumption that the self cannot be wrong, and any deviation is intentionally hostile toward the self.

I also find this all too common. I think it's a natural tendency and something each of us needs to work against in ourselves.
Those higher orders do come from the basic. But look at the world around you. People do what they have to, not what they want to or what they're good at. Practically speaking, it's inefficient. Metaphysically speaking, it's catastrophic.

Actually, I think the primary problem in Western culture is our use of leisure time. The large majority uses leisure time strictly for playing. How many acquaintances (not friends, since that's not a random sample, but acquaintances) do you have who read anything but pulp fiction? How many spend the time pursuing a talent that doesn't involve a ball of some kind? The answer will depend partially upon where you live and work, but I still think you'd have to say it is frighteningly low. I don't think the problem is primarily that we don't have enough time, but that most of us spend the time we do have fucking around. Ultimately, though, I agree.
Human beings have so much potential, yet we worry too much about the "necessary", and invent things to call "necessary" (subjective realities; e.g. religion, nation-identities, &c.).

And there you hit it on the head... it's not the necessities that are the problem. It's all those things we "invent" and then declare necessary... but aren't.
There's another joke among writers and musicians: When you're starving, insane, and your dog has left you alone because you suck, you're almost there.

There's also the adage of the writer or painter who has nothing but his art and yet is happy.
Why is there no new Sistine Chapel? Why is it taking so long for the next Mozart to come along? Whence comes the next generation of greatness?

I believe that for the most part we're medicating it and educating it out of existence. But one must also acknowledge that such works were bought at great expense. The tremendous disparity of wealth is largely what allowed certain families to sponsor such art. For all the offenses of the leisure class, they have made tremendous contributions to civilization, but those contributions were paid for in blood and suffering.
Other than another human being, how many tools do you intentionally use for something they're not made for?

Actually, I have a particular aptitude for that.
We raised temples to gods, we paid homage to our food, to our lusts ... do you realize how much of the present generation pays tribute to labor and wealth and greed? Really, I'll take rainbows and muses over that, any day. Do you live to work or work to live? All sorts of simple sloganism works here. It's all a matter of priorities.

I concur, only I would add games, or play, to the list of labor, wealth, and greed. During the Cold War I often lamented that America watched football while the Soviet Union's favorite pastime was chess. Now, after the collapse... we can all watch fucking football.
I just don't like the feeling that we're settling for something lesser, y'know?

It's worse than that... we're actively pursuing something lesser all the time, telling ourselves and each other that it's what we really want/need.
~Raithere